
Apress, Expert Oracle Database Architecture: 9i and 10g Programming Techniques and Solutions, Sep. 2005



CHAPTER 5 ■ ORACLE PROCESSES

Figure 5-3. Concurrent users vs. transactions per second

Initially, as you add concurrent users, the number of transactions increases. At some point, however, adding more users does not increase the number of transactions you can perform per second; the graph tends to flatten out. The throughput has peaked, and now response time starts to increase (you are doing the same number of transactions per second, but the end users are observing slower response times). As you continue adding users, you will find that the throughput will actually start to decline. The concurrent user count just before this drop-off is the maximum degree of concurrency you want to allow on the system. Beyond this point, the system becomes flooded and queues begin forming to perform work. Much like a backup at a tollbooth, the system can no longer keep up. Not only does response time rise dramatically at this point, but throughput from the system may fall as well, as the overhead of simply context switching and sharing resources among too many consumers itself takes additional resources. If we limit the maximum concurrency to the point right before this drop, we can sustain maximum throughput and minimize the increase in response time for most users. Shared server allows us to limit the maximum degree of concurrency on our system to this number.
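In practice, this cap is set through the shared server initialization parameters. A minimal sketch follows; the parameter names (SHARED_SERVERS, MAX_SHARED_SERVERS, DISPATCHERS) are standard Oracle init parameters, but the specific values shown here are illustrative assumptions, not recommendations—the right ceiling is the one you measured just before your own throughput drop-off:

```sql
-- Start with a small pool of shared servers; Oracle can spawn more under load.
ALTER SYSTEM SET SHARED_SERVERS = 5;

-- Hard upper bound on shared server processes: this is the maximum degree
-- of concurrency actually doing work at any instant (value is an example).
ALTER SYSTEM SET MAX_SHARED_SERVERS = 20;

-- At least one dispatcher to multiplex client connections over TCP.
ALTER SYSTEM SET DISPATCHERS = '(PROTOCOL=TCP)(DISPATCHERS=1)';
```

With these settings, no matter how many users connect, at most MAX_SHARED_SERVERS sessions execute concurrently; the rest queue, which is exactly the behavior the door analogy below describes.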

An analogy for this process could be a simple door. The width of the door and the width of people limit the maximum people-per-minute throughput. At low “load,” there is no problem; however, as more people approach, some forced waiting occurs (a CPU time slice). If a lot of people want to get through the door, we get the fallback effect: there are so many “after you” exchanges and false starts that the throughput falls, and everybody gets delayed getting through. Using a queue means the throughput increases; some people get through the door almost as fast as if there were no queue, while others (the ones put at the end of the queue) experience the greatest delay and might fret that “this was a bad idea.” But when you measure how fast
