View Full Version : What's the purpose of the cold start injector?

10-09-2010, 03:52 AM
Not asking why it needs more fuel when it's cold, but why didn't Toyota increase the fuel through the regular injectors instead of adding another injector in the intake manifold?

Hipster Lawrence
10-10-2010, 01:05 AM
I'm sure there are lots of reasons. But I think the main one is that an engineer in the 1970s thought it was a good idea, and Toyota pretty much stuck with it for the next 30 years. It worked, so why mess with it, right?

10-10-2010, 06:07 AM
The decision to use a cold start injector is usually based on injector duty cycle, pulse-width modulation, and how much fuel the injectors can flow. At cold start-up the regular injectors may be approaching their maximum duty cycle, so a cold start injector is added to supply the extra fuel needed at cold start-up.
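To make the duty-cycle point concrete, here's a small sketch of how injector duty cycle is usually figured: the fraction of the available period the injector is held open. All numbers are hypothetical for illustration, not actual Celica figures.

```python
def injector_duty_cycle(pulse_width_ms: float, rpm: float) -> float:
    """Fraction of the available period the injector is open.

    On a four-stroke engine each injector typically fires once per
    two crank revolutions, so the available period in milliseconds is
    2 * 60000 / rpm. (Assumed firing scheme; batch-fire setups differ.)
    """
    period_ms = 2 * 60_000 / rpm
    return pulse_width_ms / period_ms

# Warm idle: short pulse, huge headroom.
print(f"{injector_duty_cycle(2.5, 800):.0%}")    # ~2%

# Long enrichment pulse at high rpm eats most of the margin.
print(f"{injector_duty_cycle(15.0, 6000):.0%}")  # 75%
```

The squeeze happens when a long (enriched) pulse width meets a short period, which is why cold-running at speed, not cranking, is where duty cycle runs out.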

Nowadays the technology is more sophisticated, so a cold start injector is usually not needed.

10-10-2010, 06:18 AM
My big thing is I don't understand why they didn't just fit injectors sized to run at 75-80% duty cycle, leaving some room for cold start, instead of engineering a separate injector into the intake manifold. I would think the cost comparison between simply running larger injectors and adding a fifth one, casting it into the manifold, running wires to it, and adding a separate driver would be pretty obvious.
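The "just run bigger injectors" idea can be put as a back-of-envelope sizing calculation. Every figure below is a made-up assumption for illustration; the point is only the shape of the arithmetic.

```python
# Hypothetical sizing sketch: how big would the main injectors need
# to be to cover cold-start enrichment with duty-cycle headroom?

warm_peak_flow_cc_min = 200   # flow one injector must supply warm, WOT (assumed)
cold_enrichment = 1.5         # cold start needs ~50% extra fuel (assumed)
max_duty = 0.80               # self-imposed duty-cycle ceiling for headroom

# Rated (static) injector flow is quoted at 100% duty, so the required
# rating is the worst-case demand divided by the duty ceiling.
required_static_flow = warm_peak_flow_cc_min * cold_enrichment / max_duty
print(required_static_flow)   # 375.0 cc/min
```

Under these assumed numbers the main injectors would need to be nearly twice as large as the warm-running demand, and would then spend almost all of their life at very short pulse widths, which is one plausible reason a designer might prefer a separate cold start injector.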

Although you've got to be right; I mean, hopefully there's a good reason for doing it this way.

10-10-2010, 05:23 PM
I don't know much about Celica injectors yet, but I do know about PWM control in switch-mode power supplies, and I would think the same basic principles, and the same limitations, would apply.

In either case, you modulate the on-time to provide more or less average voltage to the load based on feedback. In a power supply the feedback is derived by monitoring the output voltage or current (depending on the output control mode) with an error amplifier and using that to set the duty cycle. In a fuel injection system I would think the feedback is provided by some sort of transducer(s) that indicate fuel demand vs. fuel flow.

In either case you have a PWM signal that controls an output and, most important, FEEDBACK to indicate and adjust that control. In any feedback loop the feedback must be NEGATIVE feedback (the feedback input is opposite in phase or polarity to the amplifier output). If the feedback ever becomes POSITIVE (same phase or polarity as the output), you no longer have an amplifier - you have an OSCILLATOR.
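A crude numeric toy shows the negative-feedback point. This is a discrete-time loop, not a real SMPS model; the gains are arbitrary. With moderate loop gain the error opposes the output and the loop settles; with too much gain the correction overshoots, the effective feedback flips sign each step, and the loop oscillates with growing amplitude.

```python
def run_loop(gain: float, steps: int = 20) -> float:
    """Drive a unity setpoint with the update: out += gain * (setpoint - out)."""
    out = 0.0
    for _ in range(steps):
        error = 1.0 - out      # negative feedback: error shrinks as out rises
        out += gain * error
    return out

print(round(run_loop(0.5), 4))   # settles at the setpoint, 1.0
print(round(run_loop(2.5), 4))   # overcorrects every step: diverging oscillation
```

The divergent case is the "amplifier becomes an oscillator" failure mode described above, just stripped down to arithmetic.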

Switch-mode power supplies came into regular use in the early to mid '80s, and in those days they were all of the constant-frequency type. A constant-frequency system has an inherent region of instability whenever the duty cycle is greater than 50%.

The reason for this, in obtuse language, is that the feedback loop transfer function has a pole in the right half-plane. In plain language, there will always be a phase change that, without some method of compensation, will make the system unstable at duty cycles above 50%. Such compensation is easy and cheap to add, usually just some capacitance in the feedback loop to provide an integration factor. The problem with that approach is that the compensation is load dependent, and will be different for different loads.

In the case of an AC amplifier such compensation is called frequency compensation; for DC control loops it is called slope compensation. Whatever you call it, it is always necessary, because any feedback loop contains reactive elements (stray capacitance and inductance in the feedback path) that shift the phase of the feedback so that at certain frequencies the loop would become unstable.

Since about the early to mid 2000s, nearly all switch-mode systems employ either a constant off-time or constant on-time approach and change the switching frequency to adjust the duty cycle. As an example, say you are switching at 100 kHz with a constant off-time of 1 µs. The period is 10 µs, so you have a duty cycle of 90% (1 µs off / 10 µs total = 0.1 = 10% off = 90% on). If you change the switching frequency to 300 kHz, the period becomes 3.33 µs and the duty cycle drops to 70% (1 µs off / 3.33 µs total = 0.3 = 30% off = 70% on).
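The constant off-time arithmetic above can be written as a one-line function, which makes it easy to see how frequency alone steers the duty cycle:

```python
def duty_cycle(switch_freq_hz: float, off_time_s: float) -> float:
    """Duty cycle when off-time is fixed and switching frequency sets the period."""
    period_s = 1.0 / switch_freq_hz
    return 1.0 - off_time_s / period_s

# The two examples from the post:
print(f"{duty_cycle(100e3, 1e-6):.0%}")  # 90%
print(f"{duty_cycle(300e3, 1e-6):.0%}")  # 70%
```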

The advantage of this technique is that the slope compensation is ALWAYS the same and can be built into the silicon itself, instead of using external resistors and capacitors whose values must be tailored to the load conditions.

I would make a guess here that, given when the Celica ECU was designed, the constant-frequency method was used for injector control, and thus the duty cycle might have been limited to around 60% max because of the above-mentioned limitations of that technique.

So, to get the extra fuel enrichment needed at cold temperatures, they added the separate cold start injector, because they could not push the duty cycle of the regular injectors beyond ~50-55% without risking instability in the feedback loop.