Smokem posted a link in another one of your threads to an article that thoroughly explained the relationship between air density in the combustion chamber and injection quality.
The research group tested how far from the nozzle maximum atomization occurs. What I gathered is that there is a certain distance at which the spray reaches maximum atomization and mixes with the swirling air charge. Farther from the nozzle than that, the fuel mist starts to recombine into larger, slow-moving droplets. The most efficient setup was to have that ideal distance land right at the outer edge of the bowl.
The trick was that as air density increases, that ideal distance gets shorter.
For example, a nozzle designed to spray to the outer edge of the combustion bowl at high load/high charge density will spray too far at low engine load, so fuel reaches the piston surface and is wasted.
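To put rough numbers on it, here's a quick Python sketch using a Dent-style penetration correlation. The injection pressure, hole size, charge densities/temperatures, and bowl radius are just ballpark guesses on my part, not figures from the article:

```python
# Rough illustration only: Dent-style spray tip penetration correlation
# S ~ 3.07 * (dP/rho_gas)^0.25 * sqrt(t * d) * (294/T_gas)^0.25  (SI units).
# All the numbers below (pressure, hole size, densities, temps, bowl radius)
# are made-up ballpark values, not from the article.

def spray_penetration(dP, rho_gas, d_hole, t, T_gas):
    """Spray tip penetration (m) after time t (s) for pressure drop dP (Pa),
    in-cylinder gas density rho_gas (kg/m^3), hole diameter d_hole (m),
    and gas temperature T_gas (K)."""
    return 3.07 * (dP / rho_gas) ** 0.25 * (t * d_hole) ** 0.5 * (294.0 / T_gas) ** 0.25

dP     = 50e6     # 50 MPa injection pressure drop (assumed)
d_hole = 0.20e-3  # 0.20 mm nozzle hole (assumed)
t      = 1.0e-3   # 1 ms after start of injection
bowl_r = 0.040    # 40 mm to the bowl wall (assumed)

for label, rho, T in [("high load (dense charge)", 25.0, 900.0),
                      ("idle (thin charge)",        10.0, 750.0)]:
    S = spray_penetration(dP, rho, d_hole, t, T)
    print(f"{label}: tip at {S*1000:.0f} mm "
          f"({'past' if S > bowl_r else 'inside'} a {bowl_r*1000:.0f} mm bowl radius)")
```

Same nozzle, same timing: the dense charge stops the spray tip inside the bowl, while the thin idle charge lets it shoot past the wall.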
As I recall, the research paper discussed 4 ways to make the spray penetrate farther into the swirl of compressed air (there's a rough scaling sketch after the list):
1. Larger hole
2. Higher injection pressure
3. Tapering the nozzle holes to resemble a funnel or cone, with the larger end toward the inside of the injector and the smaller end toward the combustion chamber.
4. Lengthening the holes of the nozzle.
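For a feel of how much each lever matters, here's a rough relative-scaling sketch. It uses the momentum-jet view of penetration (tip distance goes roughly as the fourth root of spray momentum flux over gas density); lumping the taper and hole-length effects into a higher discharge coefficient is my own shortcut, and none of the baseline numbers come from the paper:

```python
# Rough scaling sketch: far-field spray penetration from momentum-jet theory
# goes roughly as S ~ (M_dot / rho_gas)^(1/4) * sqrt(t), with spray momentum
# flux M_dot ~ 2 * Cd * A_hole * dP.  Folding the hole-taper and hole-length
# effects into a higher discharge coefficient Cd is my simplification, and
# the baseline nozzle/conditions are assumed, not from the paper.
import math

def relative_penetration(d_hole, dP, Cd, rho_gas, baseline):
    """Penetration relative to the baseline nozzle/conditions (dimensionless)."""
    m_dot = lambda d, p, c: 2.0 * c * math.pi * (d / 2) ** 2 * p   # momentum flux
    return (m_dot(d_hole, dP, Cd) / rho_gas) ** 0.25 / \
           (m_dot(*baseline[:3]) / baseline[3]) ** 0.25

base = (0.20e-3, 50e6, 0.80, 25.0)   # hole dia (m), dP (Pa), Cd, gas density (kg/m^3)

cases = {
    "1. 20% larger hole":                       (0.24e-3, 50e6,  0.80, 25.0),
    "2. double the injection pressure":          (0.20e-3, 100e6, 0.80, 25.0),
    "3./4. tapered + longer holes (higher Cd)":  (0.20e-3, 50e6,  0.90, 25.0),
    "same nozzle at idle-like charge density":   (0.20e-3, 50e6,  0.80, 10.0),
}
for name, c in cases.items():
    print(f"{name}: {relative_penetration(*c, baseline=base):.2f}x baseline penetration")
```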
Obviously #2 is why common rail systems that can ramp up pressure have an edge over traditional injection systems: nozzles can be designed to deliver fuel to the outer edge of the bowl at all engine load/charge density conditions.
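Back-of-the-envelope, if you want the spray tip to just reach the bowl wall at the same time after start of injection, that same correlation says the required pressure drop scales roughly linearly with charge density. The densities and reference point below are my guesses, not from the article:

```python
# Rough sketch of the common-rail idea: to keep the spray tip landing at the
# bowl wall at every load, the needed pressure drop scales about linearly with
# charge density (from S ~ (dP/rho)^(1/4) at fixed timing and hole size).
# The reference point (50 MPa at 25 kg/m^3) is assumed, not from the article.

def required_dP(rho_gas, rho_ref=25.0, dP_ref=50e6):
    """Pressure drop (Pa) that keeps penetration equal to the reference case."""
    return dP_ref * (rho_gas / rho_ref)

for label, rho in [("idle", 8.0), ("cruise", 15.0), ("full boost", 30.0)]:
    print(f"{label}: charge density {rho:4.1f} kg/m^3 -> "
          f"~{required_dP(rho)/1e6:.0f} MPa to hit the same bowl radius")
```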
Think about a 12-valve with a shower-head nozzle. It dumps a ton of fuel that only cleans up at 120 psi of boost, when the charge air density is extremely high. That same injector is smoky at idle because the spray penetrates too far: it "washes down the cylinder" or hits the piston face before the fuel can mix with air and burn.