Arnot
Member · 616 posts
Everything posted by Arnot

  1. I did actually see such a narrow boat built at or near Pete Wyatt's Canal Cruising boatyard in Stone, although it was built by a rather strange chap. As far as I can remember, shortly before he completed it, the boys in blue pounced and carted him off for an extended holiday at Her Majesty's pleasure, and I never found out what happened to the boat. In his case there was a humungous six cylinder diesel and a massive motor that together consumed about a third of the length of the boat. I did ask him why he was doing it and never got a sensible reply. My thoughts... 1) Kipor are firmly at the bottom of the quality ladder in generators: fine for the occasional power cut but far from continuously rated. Try Harrington generators for a more robust unit, but they are about three times the price. They use the good ol' Kubota three cylinder jobby that, with a whiff of a service every year or two, seems to go on forever. 2) One way would be to control the generated power by routing it through an industrial inverter, which will accept single phase and then convert it into three phase with controllable voltage, current and frequency curves. Connected to a three phase motor, this would give completely variable speed and torque (within the permissible bounds of the motor design) as well as reverse. This is how large industrial motors are driven, as the system provides a whole host of benefits. 3) A simpler way would be to use an engine with a three phase alternator connected directly to a three phase motor, providing a synchronous link. The motor can be reversed by merely swapping two of the phases. This system of driving a motor from an alternator is not popular in Europe because it is almost impossible to specify, but it is very widely used almost everywhere else because there is very little to go wrong; some such systems have been pumping water in the Australian Outback reliably for over 50 years.
I suspect that at lower speeds the torque would be rubbish and that motor heating would be an issue, but it might work... Hope this helps. Regards Arnot
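The synchronous link in option 3 can be sketched numerically with the standard synchronous speed relationship. This is an illustration only; the pole counts and engine speed below are assumed figures, not anything from the post.

```python
# Synchronous link: a three phase alternator feeding a three phase motor
# directly locks the motor shaft speed to the alternator's electrical
# frequency. Pole counts and rpm below are illustrative assumptions.

def alternator_frequency(engine_rpm: float, poles: int) -> float:
    """Electrical frequency of an alternator: f = p * n / 120 (Hz)."""
    return poles * engine_rpm / 120.0

def synchronous_rpm(frequency_hz: float, poles: int) -> float:
    """Synchronous speed of an AC machine: n = 120 * f / p (rpm)."""
    return 120.0 * frequency_hz / poles

# A 4-pole alternator at 1500 rpm gives 50 Hz, which drives a 4-pole
# motor at 1500 rpm; swapping any two phases reverses the rotation.
f = alternator_frequency(1500, 4)
print(f)                      # 50.0
print(synchronous_rpm(f, 4))  # 1500.0
```

As the engine slows, the frequency and hence the motor speed fall with it, which is consistent with the caveat about poor torque at low speeds.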
  2. Just for what it's worth, a major part of my job is electrical testing of mobile low voltage (the official designation of normal mains voltage) installations and I come across this one almost daily. In short, I would suggest that you can rely on the operation of the test button, but if you suffer from insomnia here is a more (but not fully) detailed explanation… First, an RCD operates merely on imbalance of current through the live (phase) and neutral; although in practice it will usually be an earth fault that causes one to trip, the relevant imbalance, however caused, will have the same effect, and a lot of mystery tripping stems from a neutral to earth fault in some sort of appliance. The test button on RCDs does not "just" check the tripping mechanism; it also tests the windings and the actuator coil (if any). For the purposes of the operator, using the test button for regular testing is perfectly adequate. There is a reason for this, which is that these devices also have to operate in installations where there is no hard reference to earth, or where (as in some countries) the supply is not 230VAC but 115+115VAC, i.e. the protective earth is fixed in between two phase connections. This also applies to transformer fed 110VAC supplies that derive some of their inherent safety from any earth to phase voltage being limited to 55VAC. You will possibly have noticed that, unlike an RCBO, an RCD does not have an earth connection. The general object of the device is to try to ensure that under no circumstances does the touch voltage exceed 50V and that a large imbalance (or leakage) is disconnected within 40ms.
For the purposes of standardisation, it is reckoned that a shock of less than 30mA or for less than 40ms will not be fatal (although it may well bring about a stream of invective). As stated in a later post, the electrical systems on boats would ideally be tested once a year, but as they are not generally "places of work" they do not fall under the auspices of PUWER 98, which is the main driver for the testing of most vehicle installations. On a full inspection and test, the RCD is tested from an external known supply for both the current it trips at and the time it takes to trip under a variety of fault conditions. On conclusion of the calibration testing, the test button is tried as well. If the installation will often be used with an inverter supply, it is also tried with the inverter installed. There is a slight snag with testing RCDs from an inverter supply, which is that frequently the waveform is not the pure sine that it needs to be for a test to give reliable results. In fact, the better calibrated test equipment will just lock out and not allow testing to take place if the voltage, waveform or frequency are outside defined limits, or for that matter if there is more than a small harmonic content. I am sure that there must be others, but the only inverters I see that give an output suitable for calibrated testing are Mastervolt and Victron, and even then only when they have been wired so that the neutral is referenced to the protective earth. As stated in another post, you can also test an RCD using a resistor between live and earth to generate a 30mA fault current (but please do it carefully, with the resistor located inside a plug top); as far as the RCD is concerned this really only replicates what the test button does. For those who are interested, the value of the resistor should be about 7.5kΩ.
There has been some reference to the test button not verifying that a fault to earth will trip it, and this is true – but – if the protective earth really is completely isolated then a person touching live and earth would not complete a circuit and thus would not receive a shock, fatal or otherwise. In practice, most small generators and most inverters do have a capacitor between the phase output and the protective earth which will allow enough current for an RCD to trip, or in the absence of one for a small amount of fault current to flow, which is why the regular use of the test button is highly recommended. If you want to verify the protective earth there are good ways of doing it, but these are best left to those with the appropriate expertise and equipment; this sort of testing can be inherently unsafe and should only be carried out very carefully. Please, please don't attempt it! Regards Arnot
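The resistor value quoted above drops straight out of Ohm's law. A quick sketch (the preferred-value rounding and the power figure are my own working, not from the post):

```python
# Ohm's law gives the resistance needed to draw a ~30 mA fault current
# from a 230 V supply; 7.5 kOhm is simply the nearest common preferred value.

def rcd_test_resistor(supply_v: float = 230.0, trip_ma: float = 30.0) -> float:
    """Resistance (ohms) that passes trip_ma at supply_v: R = V / I."""
    return supply_v / (trip_ma / 1000.0)

r = rcd_test_resistor()
print(round(r))                # 7667 -> fit the nearest preferred value, 7.5 kOhm
# Power dissipated during the test, P = V^2 / R, is roughly 7 W,
# so the resistor should only be connected momentarily.
print(round(230.0 ** 2 / r, 1))  # 6.9
```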
  3. Thank you Snibble, this is the phenomenon I was trying (inadequately) to describe in an earlier post, and my explanation is this: if you draw on paper (or a computer, if you are that way gifted) the waveform produced by a three phase stator with a peak of about 20v, and then draw a line at the DC voltage at which a battery would start charging (say 13v), then the difference voltage divided by the internal resistance of the battery will give the current. The current multiplied by the difference voltage will give you the power transfer. If you then do another drawing with the peak voltage increased to say 23v (as in battery sensing) and do the same equation, you will find that the current, and more particularly the power transfer, has increased significantly. Now if you do the same two calculations with a single phase alternator, you will find that because of the lower duty cycle of the waveform the difference is even more pronounced, and the increase from 40A to 55A becomes pretty much predictable. What you actually did was just what I suggested, although the increase on a standard alternator would probably not be so great. Did you notice the more rapid warming of the stator? It is my guess (and I have not done the calculations) that the inductive component of the stator is relatively insignificant compared to the effects of the winding resistance and won't have much impact on the output. If the inductance did have a marked effect then I would anticipate that the output of alternators would fall off as the shaft speed, and hence frequency, increased, and it doesn't seem to on test. Question – did you try the same switching of the sensing point at different speeds? My prediction would be that there would be little or no difference. After all, the frequency at 2500rpm would only be about 42Hz, there are very few windings and quite a lot of air gap. Please bear in mind that in doing this sort of thing I don't take a very academic approach, more an empirical one.
As long as the effect is explicable and more or less predictable, I prefer to leave the absolute mathematical proof to others… So on this basis I will stick to my rule of thumb: the current output of an alternator is proportional to the difference between the voltage at which the battery starts to draw current and the peak voltage, divided by the internal resistance of the battery. There are a lot of other linear and non-linear variables involved, but even lumped together they are minor. Regards Arnot
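The back-of-envelope calculation described above can be sketched in a few lines. The 0.05Ω battery internal resistance is an assumed illustrative figure, and this ignores the duty cycle effect that makes the single phase case even more pronounced:

```python
# Charging current = (stator peak voltage - battery acceptance voltage)
# divided by the battery's internal resistance; power transfer is that
# current times the same voltage difference.

def charge_current(peak_v: float, battery_v: float, r_internal: float) -> float:
    return (peak_v - battery_v) / r_internal

def power_transfer(peak_v: float, battery_v: float, r_internal: float) -> float:
    diff = peak_v - battery_v
    return (diff / r_internal) * diff

R_INT = 0.05   # assumed battery internal resistance, ohms
V_BAT = 13.0   # voltage at which the battery starts to draw current

for peak in (20.0, 23.0):   # machine-sensed vs battery-sensed peak voltage
    i = charge_current(peak, V_BAT, R_INT)
    p = power_transfer(peak, V_BAT, R_INT)
    print(f"{peak:.0f} V peak: {i:.0f} A, {p:.0f} W")
# 20 V peak: 140 A, 980 W;  23 V peak: 200 A, 2000 W --
# a 15% rise in peak voltage roughly doubles the power transfer.
```

Because the power goes as the square of the difference voltage, a modest rise in peak voltage produces the disproportionate gain the post describes.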
  4. Ah hah! That explains it - thanks for the info. That probably means that with a bit of modification the charging voltage can be trimmed as well. Regards Arnot
  5. Further to my last post, I just had a look at the Adverk website but could not seem to find any installation instructions or description of the dv/dt algorithm used by their system. However, in all fairness, since this is commercially sensitive it's not too much of a surprise. From what I could divine, the Adverk BMS is a relatively sophisticated form of overlay alternator regulation that overrides the original regulator if its own algorithm decides that the output voltage should be higher. It does clearly show a sensing lead connected directly to the services battery and thus should be able to correctly control the charging voltage. I suspect that the algorithm is similar to the type used by many multi stage lead acid battery chargers and has bulk, absorption and float voltages. Although in their text they mention temperature compensation, I don't see any sensing system for this and so presume that it is built in to the control unit. This is fine if it is thermally coupled to the batteries; otherwise it will just be reacting to the temperature of the control unit which, with a different location and a different thermal mass, will bear little relationship to the battery temperature. There is another potential problem here. Some alternators have a regulator system that not only ties the field terminal low to increase the output but can also feed it with a voltage to actively lower the alternator output. If the Adverk BMS is connected to one of these there will occasionally be a conflict. In this case the strongest device will win and the weaker device (or even both of them) may well be permanently damaged. If this happens it may not be apparent, and in all fairness may have no practical impact, but in a worst case scenario regulation may be completely lost and the whole system could cook.
One final thing: I didn't see any method of trimming the regulation points of the BMS to suit differing battery types, but then I didn't have the instructions, so there may be some way of doing this. For example, long periods of 14.4v applied to engine heated cheap flooded cell batteries will shorten their life significantly, whereas the same voltage applied to thin plate, pure lead AGM batteries will take an age to get them fully charged. Having said all that, I like the approach of Adverk, and a lot of what they say on their site not only makes good sense but also betrays a profound understanding of what is going on. Reading between the lines, they have the same difficulty I have found in explaining a complex non-linear relationship in simple terms. Absolutely true, and as an interesting extension of this, there will probably be an optimum for the voltage drop in the wiring between the alternator output and the batteries. Thus although common sense dictates that the thicker the cable the better, this may not actually be the case... One word of warning here: when a diode type split charge is used in this type of control system, the additional quantity and duration of the charging current will increase the thermal output of the splitter diodes and they may well get very warm. It is thus a good idea to make sure that they are mounted so that the cooling fins are vertical, and preferably in some sort of forced air stream. My preference would be for a relay type split charge system and optimised cable resistances to give about 1.0v drop at full alternator output. This is the system I have used for the last thirty something years and it seems to work just fine. Regards Arnot
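The optimised cable drop idea at the end can be roughed out with Ohm's law and the resistivity of copper. The 90A alternator output and the 6m cable loop are assumed figures for illustration only:

```python
# Pick a total cable-loop resistance that gives about a 1.0 V drop at
# full alternator output, then size the conductor to match.

RHO_CU = 1.72e-8   # resistivity of copper, ohm-metres (standard figure)

def target_resistance(drop_v: float, full_current_a: float) -> float:
    """Total loop resistance for the desired drop: R = V / I."""
    return drop_v / full_current_a

def cross_section_mm2(resistance_ohm: float, loop_length_m: float) -> float:
    """Conductor cross-section for a given loop resistance: A = rho * L / R."""
    return RHO_CU * loop_length_m / resistance_ohm * 1e6

r = target_resistance(1.0, 90.0)             # assumed 90 A alternator
print(round(r, 4))                           # 0.0111 ohm
print(round(cross_section_mm2(r, 6.0), 2))   # assumed 6 m loop -> 9.29 mm^2
```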
  6. Sorry, I ignored the bit about the Adverk. Correct me if I am wrong here, but the Adverk devices I have come across are not actually regulators in the true sense of the term but a device that overrides the original regulator for a while to boost the charging voltage temporarily. I was looking at a boat the other day that had some weird intermittent charging fault (needless to say it worked perfectly all the time I was there) with one on, and the owner had merely disconnected the wire to the alternator and the original regulator continued working. From what I could see it merely kept the rotor current maxed for longer by keeping the field terminal to earth when the inbuilt regulator would have allowed it to float up. Rather like converting the alternator to a three stage charger, but without any feedback loop. What it did not have was any connection to the batteries themselves, and thus it could have no way of knowing the actual battery voltage. Of course it possibly had not been installed correctly, but since the fault did not occur after an hour and the owner did not have flat battery issues I didn't poke around too much. I accept that there may be other Adverk devices, and if you let me know what yours is I will try to get some idea of what it does from their website. However, back to your question: "what I am wondering is whether the higher rotor current (which is a consequence of the extra volt being applied across it during this bulk charging phase) would result in an increased output current." The answer is yes: given sufficient speed, and that the alternator is in constant voltage mode, an increased rotor current would result in higher rotor flux density and thus higher stator voltage or current at a given speed. However this is only one part of the charging current improvement; probably more is gained by the stator peak voltage being allowed to go higher than normal, as per the maths bit in a previous post I made. A double whammy... Regards Arnot
  7. Not quite – including a diode splitter will not actually increase the amount of charging current available to the batteries; what it does is fool the regulator into thinking that the battery voltage is higher than it actually is, thus causing the regulator to restrict the output of the alternator prematurely and reducing the charging current. I suspect that you are confusing rotor current and output current here… Regards Arnot I think you have it right on the matter of diode voltage drop. The issue of saturation (I presume you mean magnetic flux saturation) is actually almost never a problem in practice: providing the rotor design is good (and it usually is), it would require significantly more current than could be obtained from a nominal 14.4v supply to saturate the rotor. However, as the temperature rises this becomes more of a possibility. Regards Arnot
  8. You are absolutely correct sir! It serves me right for trying to over simplify what is quite a complex relationship in a rush. When the alternator first kicks in and the batteries are both flat and in good condition, they are able to accept more current than the alternator can provide at the regulated voltage. At this stage the alternator is current limited (either by stator design or by RPM) and (as you say) the rotor current is maxed, so no more is available regardless of any electrickery. That's the bit I skipped over - well spotted... However, in practice, this stage generally only lasts for a very short time given even moderate engine speed, and once the current the batteries need to support the regulated voltage falls below the maximum of the alternator, this is where the problems generally occur. At this stage, the regulator kicks in and the alternator starts to operate in constant voltage mode, and here sophisticated regulation systems can help. On a conventional alternator, the regulator measures the voltage available on the diode trio and operates on the assumption that this is the required output. But, as has been commented on elsewhere, the voltage drop on the main rectifier under load is significantly higher than that of the diode trio, and at higher currents there is a significant voltage drop across the wire from the alternator to the batteries, dismissing for the moment any splitter diodes or FETs. In consequence, in the early stages of the constant voltage mode, when the output current is not at maximum but somewhere near it, the voltage the regulator "sees" is quite a lot higher than the actual battery voltage and it holds back the available current unnecessarily.
Alternatively, if there is a regulation system that "sees" the true battery voltage and thus compensates for the diode error and cable losses, the full output of the alternator can be maintained for far longer, and during part of the charging cycle actually increased over and above the current that a conventional system would be able to produce (as described in my last post). Please forgive me, but the complex interaction of non-linear variables that takes place when an engine with varying speed drives an alternator with varying output into batteries with varying and non-linear charge acceptance characteristics pretty much defies any verbal explanation (for me at least), and I have probably not made the topic much clearer. However, the effect can be amply demonstrated in practice. If a regulator with remote sensing is fitted to an engine and made to charge a battery bank that is fairly flat, then once the initial constant current mode has finished, if the output current is monitored as the sensing position is changed from the diode trio output to the battery positive, a significant increase in the charging current will be observed. In addition it will be seen that the charging voltage on the battery terminals both rises to the actual regulated voltage and becomes more constant at varying engine revs. CAUTION – do not try this at home! To do this demonstration you must have some resistor networks in place to prevent the sensing wire ever being open circuit. If this happens the alternator can run unregulated, and if the batteries are reasonably well charged and/or the engine revs are high, the voltage can achieve levels that would damage equipment. I hope that this clears it up slightly. Regards Arnot
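As a rough sketch of why a machine-sensed regulator backs off early, the voltage at the sensing point can be modelled as the battery voltage plus the cable drop plus the extra main-rectifier drop. All the figures below are assumed, illustrative values, not measurements from the post:

```python
# A machine-sensed regulator reads the diode-trio voltage, which sits
# above the true battery voltage by the cable drop plus the difference
# between the main rectifier and diode trio drops under load.

def regulator_sees(battery_v: float, current_a: float,
                   cable_r: float, rect_minus_trio_v: float) -> float:
    """Voltage at the sensing point of a machine-sensed alternator."""
    return battery_v + current_a * cable_r + rect_minus_trio_v

# Assumed values: 13.8 V at the battery, 70 A flowing, 0.01 ohm cable,
# and the main rectifier dropping 0.4 V more than the diode trio.
v_seen = regulator_sees(13.8, 70.0, 0.01, 0.4)
print(round(v_seen, 2))   # 14.9 -- the regulator thinks charging is nearly done
# A 14.4 V set point is reached while the battery is only at 13.8 V,
# so the field current is throttled prematurely.
```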
  9. You are both correct and incorrect here… When an alternator is charging a battery that is very flat, the current output will be limited by the regulator providing maximum field current. This current is related to the resistance of the rotor winding and the regulated voltage: if you increase the regulated voltage, you get more rotor current and a proportional increase in flux density. The current output of an alternator is (simplistically) a function of the impedance of the windings and the difference between the peak voltage of the AC waveform and the nominal voltage of the battery. I.e. (again simplistically) if the stator impedance is 0.02Ω, the nominal voltage of the battery is 12.6v and the charging voltage is 14.4v, then the output is 1.8/0.02 = 90A. If the regulated voltage is then increased to 15.4v, the output is 2.8/0.02 = 140A. Obviously, 15.4v is too high to charge most batteries safely – but – if the alternator is battery sensed, then the voltage drop in the wiring to the battery will (especially at high current) probably be about 1.0v, so although the alternator output is 15.4v, the battery only sees 14.4v. As the current falls, so the voltage drop on the wiring also falls, and the regulator compensates for this by lowering the alternator output voltage to maintain the battery charging voltage at 14.4v. Obviously with a regulator that can be adjusted, the battery charging voltage can be set wherever you want, and there are a number of factors that dictate this, such as what type of batteries are being used, the temperature of the batteries and how long they will typically remain on charge. Gel and AGM batteries will usually cope with a higher charging voltage, higher temperatures usually require a reduction of the charging voltage, and if the engine is usually only run for short periods of time, the charging voltage can usually be increased, for a while at least.
A lot of this depends specifically on the battery chemistry and construction, and most battery manufacturers will provide this sort of information. In a graphical sense, if you compare the output curves of an optimised battery sensed system and a conventional machine sensed system, you will see that the output of the battery sensed system rises much more quickly with increased speed, although ultimately the output is not much higher. It is this change in the slope of the output characteristic that leads to the apparently disproportionate increase in output at lower speeds. I am just constructing a website to demonstrate this more clearly and will post the URL when it is more complete for those who are interested. The problem (as I see it) stems from the fact that boats tend to be fitted with alternators that are designed to be used in vehicles that generally operate at much higher and more variable engine speeds. Narrow boats, on the other hand, rarely operate at much above minimum speed for any length of time and are rarely taken anywhere near maximum speed for more than a second or two. Is there a down side? Yes, possibly. When the alternator is running at fairly low speed at maximum output, the increase in charging current is matched by an increase in the heat generated by the stator windings and the alternator will tend to run very hot. If the engine compartment cooling is poor or the rotor bearings are poor, this will lead to accelerated failure. I often see alternators with a fan designed for the opposite direction of rotation to the drive supplied, and in this case the cooling is drastically reduced. With large heavy duty alternators such as Leece Neville and some of the Delco range, to name but two, this increased heat generation is not a problem unless they are incorrectly installed, but when a normal car alternator such as the A115 or A127 is used, if there is a high domestic load then the life of the alternator will probably be reduced by about 50%.
The upside is that the battery life will be significantly extended and almost all the electrical equipment in the boat will operate more efficiently and reliably. You pays your money and takes your choice. There is no particular difference between an integral regulator and an external regulator in an electronic sense; it is just that a lot of alternators cannot be modified for compensated external sensing other than by using an external type. They both do the same job; it's just that usually an integral system is cheaper, neater, easier to install and, with less wiring and fewer connections to fail, statistically more reliable. The loss of the blue LEDs is a more serious issue… That's probably enough for now, but if you have any more questions please don't hesitate to ask. Regards Arnot
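The arithmetic two paragraphs up can be reproduced directly. A minimal sketch using the same simplistic figures from the post (0.02Ω stator impedance, 12.6v nominal battery):

```python
# Simplified alternator output: (charging voltage - nominal battery
# voltage) divided by the stator impedance, exactly as in the post.

def output_current(charge_v: float, battery_v: float = 12.6,
                   stator_z: float = 0.02) -> float:
    return (charge_v - battery_v) / stator_z

print(round(output_current(14.4)))   # 90 A -- the machine-sensed case
print(round(output_current(15.4)))   # 140 A at the alternator terminals...
# ...while with battery sensing, the ~1.0 V drop in the wiring at high
# current means the battery itself still sees only about 14.4 V.
```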
  10. Similar in its effect, just quite a bit easier to do (and sans blue LEDs): it is in effect an adjustable, battery sensing internal regulator. The kit consists of a new regulator brush box that just replaces the original. The replacement item has another wire that has to be connected to the positive of the battery that is used most (probably the domestics) through a calibration resistance to set the desired voltage. It can even be made variable at the panel if you wish, but I wouldn't recommend it. This then battery senses the charging system and at the same time allows the regulating voltage to be set to whatever is appropriate. Very useful if you use AGM or Gel batteries. Typically the ultimate output of the alternator only rises by about 10-20%, but the real benefit comes at low engine speeds. At 50 to 100% above the speed where the alternator commences charging, the increase in current can be over 100%. There is a technical justification for this, but I won't bore you with it unless you really want to know. I first developed this system in the 70's when I did a couple of boats with seven 2V fork lift truck cells to provide about 14.5v even with the engine not running, and plenty of capacity. This was to get round the problems of voltage drop for the early pumps fitted at the front of a 70ft hull. The regulation system allowed me to run the 160A alternator at 16.8v charging voltage at the battery for a quick recharge and good control; I seem to remember it increased the current output to about 210A, but it was thirty years ago... At the moment I only have an updated version for the A115 and A127, but am working on varieties for the Iskra unit fitted to Beta engines and both 12v and 24v for the Leece Neville bigguns. By the way, if you really want loads of output at low engine speeds on one of the Gardners, Rustons or other vintage engines, then Leece Neville is the way to go. Just a bit expensive. Hope this helps... Regards Arnot
  11. I possibly have a solution for you here: I have a replacement regulator system for these alternators that can be retrofitted to a working alternator, which provides both battery sensing and the ability to set the charging voltage between 11.5v and about 16.5v quite simply. It is a system that I install professionally in Staffordshire and don't normally sell as a kit, but I can't think of any reason why I shouldn't. It just requires a bit of electrical/mechanical know-how and the ability to solder. If you are interested, PM me and I will work out how much a kit would have to be... Regards Arnot