Decrease in Size of Alternator/Motor with Increase in Frequency


gkraju
30th May 2011, 05:07
In aircraft, the operating frequency of the electrical systems is mostly 400 Hz. The reason usually given is that the size/weight of the equipment reduces as the frequency increases. Can anyone explain the theory behind this with a formula? And please don't stop at the formula: how and why does the size reduce? I had already checked Yahoo Answers, but there was no satisfactory reply anywhere.

Checkboard
30th May 2011, 10:41
The voltage coming out of the alternator depends on two variables: the amount of current flowing through the field coil (i.e. the strength of the magnetic field) and the speed at which the alternator's field is rotating. Increase the speed, and you need fewer windings in the armature for the same output voltage.
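To put a rough formula on that, here is a minimal sketch using the standard RMS EMF relation for a sinusoidally varying flux, E = 4.44 x f x N x Phi_max; the voltage and flux figures below are made up for illustration, not taken from any real machine:

```python
# Sketch: RMS EMF of a winding, E = 4.44 * f * N * Phi_max.
# Solving for N shows the turns needed fall as the electrical frequency
# (i.e. the rotation speed) rises. Numbers are illustrative only.

def turns_required(e_rms, freq_hz, flux_wb):
    """Turns needed for a target RMS voltage at a given frequency and peak flux."""
    return e_rms / (4.44 * freq_hz * flux_wb)

for f in (50, 400):
    print(f"{f} Hz: about {turns_required(e_rms=115.0, freq_hz=f, flux_wb=0.002):.0f} turns")
# 50 Hz needs roughly 8x the turns of 400 Hz for the same voltage and flux.
```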

If you can match your frequency requirement to the speed the alternator will be rotating in service, then you don't need a step-down gear box either.

(Nice reply ASFKAP - shows your generosity of spirit. :rolleyes:)

max_torque
30th May 2011, 12:09
Starting at the beginning:

Power = speed x force

So, if you want more power, you either make more "force" or you make the same force at a greater "speed".

(For rotating machinery, power [kW] = (torque [Nm] x speed [rpm]) / 9549.)

So 1 Nm of torque is about 0.1 kW at 1,000 rpm, but about 1 kW at 10,000 rpm.
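A quick sketch of that conversion (the 9549 comes from 60,000 / 2*pi; the torque and speed values are just examples):

```python
import math

def shaft_power_kw(torque_nm, speed_rpm):
    """Mechanical power in kW from torque and rotational speed."""
    omega_rad_s = speed_rpm * 2 * math.pi / 60
    return torque_nm * omega_rad_s / 1000   # W -> kW; same as torque * rpm / 9549

print(f"{shaft_power_kw(1, 1000):.3f} kW")    # ~0.105 kW at 1,000 rpm
print(f"{shaft_power_kw(1, 10000):.3f} kW")   # ~1.047 kW at 10,000 rpm
```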


For an electrical machine (motor/generator), the "torque" produced (or absorbed) is proportional to the volume (and hence mass) of "active" material in the machine (copper, iron, etc.). Hence the overall mass and size of the machine depend on the torque you want to make.

So, to provide more power, it makes sense to increase the rotational speed of the device rather than the torque required/produced. (Ultimately, speed is limited by device cooling and by mechanical limits due to centrifugal acceleration, etc.)

Further to this, for electrical power, the torque is proportional to current [amperes] and the speed to voltage [volts]. As losses (and transmission/conversion costs) increase with the square of current (I²R losses), it makes sense to move power around at the highest voltage possible to minimise the current required (which is also why, for example, the National Grid uses 400 kV!).
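A minimal sketch of that I²R point, with an arbitrary feeder resistance picked purely for illustration:

```python
# Sketch: moving the same power at a higher voltage cuts the current,
# and conduction loss falls with the square of current (P_loss = I^2 * R).
# The 0.1 ohm feeder resistance is an arbitrary illustrative value.

def line_loss_w(power_w, volts, resistance_ohm):
    current = power_w / volts
    return current ** 2 * resistance_ohm

for v in (115, 230):
    print(f"{v} V: {line_loss_w(power_w=10_000, volts=v, resistance_ohm=0.1):.0f} W lost")
# Doubling the voltage quarters the I^2*R loss for the same delivered power.
```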

For a conventional rotating-field multi-phase alternator, the output frequency is fixed by the rotation speed and the number of pole pairs (electrical cycles per mechanical revolution).
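As a sketch of that relationship (f = pole pairs x rpm / 60; the pole counts and speeds below are illustrative, not from any particular aircraft):

```python
def output_freq_hz(pole_pairs, speed_rpm):
    """Electrical output frequency of a synchronous alternator."""
    return pole_pairs * speed_rpm / 60

print(output_freq_hz(pole_pairs=2, speed_rpm=12000))  # 400.0 Hz
print(output_freq_hz(pole_pairs=1, speed_rpm=3000))   # 50.0 Hz
```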

To avoid the cost, complexity and bulk of an intermediate frequency or voltage conversion, it is best to design the generating system to run at as stable a speed as possible, and to transmit the resulting AC power directly to the power transmission system.


The result of all this: high power = high electrical AC frequency!

Flash2001
30th May 2011, 15:41
Higher frequencies require less iron in motors, transformers, etc. for the same power rating.

After an excellent landing etc...

gkraju
31st May 2011, 02:29
Not for homework, just out of curiosity and a long-pending doubt of mine.

Checkboard
1st Jun 2011, 15:33
Several times a year? Really? I hang out here pretty constantly, and have for the last 12 years or so, and I haven't seen this one... feel free to post any previous links you can find, then.

MurphyWasRight
1st Jun 2011, 16:52
Although speed/torque may be a minor factor, I believe the primary reason for 400 cps (cycles per second, the term used when this system was designed; now known as hertz) is the saving in iron needed for magnetic components such as transformers and motors.

The reason iron is reduced is that the inductance required goes down as frequency increases.

The reason that 400 Hz was chosen is that, with the materials available at the time, hysteresis and other losses became unacceptable at higher frequencies.

By contrast, some modern switching power supplies operate at well over 1 MHz and use tiny inductors.
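A minimal sketch of why the inductance requirement falls with frequency: to keep the magnetizing current at a fixed fraction of the rated current, the magnetizing reactance X = 2*pi*f*L has to stay the same, so the inductance (and roughly the iron needed to get it) scales as 1/f. The 500 ohm target reactance below is an arbitrary illustrative figure:

```python
import math

def inductance_for_reactance(target_reactance_ohm, freq_hz):
    """Inductance needed to present a given reactance at a given frequency."""
    return target_reactance_ohm / (2 * math.pi * freq_hz)

for f in (50, 400, 1_000_000):
    l_mh = inductance_for_reactance(target_reactance_ohm=500, freq_hz=f) * 1000
    print(f"{f} Hz: {l_mh:.4f} mH")
# ~1592 mH at 50 Hz, ~199 mH at 400 Hz, ~0.08 mH (80 uH) at 1 MHz - hence tiny inductors.
```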

Tjosan
1st Jun 2011, 17:05
The reason that 400 Hz was chosen is that, with the materials available at the time, hysteresis and other losses became unacceptable at higher frequencies.

Another very important aspect in the old days was the weight of electrical equipment; hence less iron, less weight.

syseng68k
2nd Jun 2011, 21:55
Others have answered this pretty well, but a bit more:

The higher the frequency, the less inductance is needed in magnetic components. Fewer turns of copper and less iron in the core (alternators, motors, transformers, etc.), which means less weight. This was and still is a critical factor in aviation, where every kg saved is more freight or less fuel. If you look at the avionics fit of some aircraft in the '50s and '60s, it's surprising they ever got off the ground, but they just put big engines in :-).

Higher frequencies have been used in the past, even as early as WW2, with some post-war UK military aircraft (i.e. the Lightning and Vulcan) using 1600 Hz supplies in places. At that frequency you need fairly exotic and expensive magnetic materials to keep losses down. That was acceptable in the Cold War, but too expensive for commercial use, and by that time 400 Hz supplies in aircraft had been around since WW2 at least, with a whole infrastructure of components and standardisation behind them.

As an example, a 1 kVA 50 Hz transformer would be quite heavy and take two hands to lift, while the 400 Hz equivalent would fit in the palm of one hand...
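To put a rough number on that, here is a sketch from the same EMF relation (V = 4.44 x f x N x A_core x B_max), comparing the core cross-section needed at 50 Hz and 400 Hz; the voltage, turn count and flux density below are illustrative, not measured from any real transformer:

```python
# Sketch: for a fixed voltage, turn count and peak flux density, the core
# cross-section a transformer needs scales as 1/f, so the 400 Hz core can be
# roughly an eighth of the 50 Hz one. All numbers are illustrative only.

def core_area_cm2(v_rms, freq_hz, turns, b_max_tesla):
    area_m2 = v_rms / (4.44 * freq_hz * turns * b_max_tesla)
    return area_m2 * 1e4  # m^2 -> cm^2

for f in (50, 400):
    print(f"{f} Hz: ~{core_area_cm2(v_rms=230, freq_hz=f, turns=500, b_max_tesla=1.2):.1f} cm^2")
# ~17.3 cm^2 at 50 Hz vs ~2.2 cm^2 at 400 Hz for the same winding and voltage.
```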

Regards,

Chris