Hi,
Sorry for the late reply, and thanks everyone for the suggestions.
I did not provide many details because I am new to this domain, and I wanted to see from
the reactions whether I am missing some relevant direction.
We are trying to measure how an Oracle database can cope with an increase in usage
(basically, there will be more users for the application). We are basing our analysis on
typical SQL statements coming from the application. We are currently doing load testing
by:
- recording SQL statements from some use cases that are considered important
- generalizing them by replacing actual values with generic variables
- providing meaningful values for the generic variables
- replaying them against the database from several client machines (a rough sketch of this step follows below)
- consuming the first record from the responses
- reporting the timing of the statements
- recording the CPU, memory and I/O load of the server
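To make the replay step concrete, here is a minimal sketch in Python. The connection string, credentials, table name, and bind values are placeholders rather than our real workload, and it assumes the cx_Oracle driver; the point is just the shape: generalized statements with bind variables, fired concurrently, with only the first row consumed and the elapsed time recorded.

import time
import concurrent.futures
import cx_Oracle  # assumes the cx_Oracle driver is installed

DSN = "dbhost:1521/orcl"  # hypothetical connection string

# A generalized statement with a bind variable, plus a few value sets
# standing in for the "meaningful values" we substitute.
STATEMENT = "SELECT * FROM orders WHERE customer_id = :cust_id"
BIND_SETS = [{"cust_id": 101}, {"cust_id": 202}, {"cust_id": 303}]

def replay_one(binds):
    """Execute one statement, consume the first row, return elapsed seconds."""
    conn = cx_Oracle.connect(user="app", password="secret", dsn=DSN)
    try:
        cur = conn.cursor()
        start = time.perf_counter()
        cur.execute(STATEMENT, binds)
        cur.fetchone()  # consume only the first record, as in our setup
        return time.perf_counter() - start
    finally:
        conn.close()

# Fire the statements concurrently to simulate several clients.
with concurrent.futures.ThreadPoolExecutor(max_workers=5) as pool:
    timings = list(pool.map(replay_one, BIND_SETS))
print("timings (s):", timings)

In reality each client machine runs many statements with its own delays between them, which is exactly where the "what is typical usage" question below comes in.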
However, I am interested in pitfalls, and in how people interpret the results, given
that it is hard to determine what typical usage is in terms of which statements to
trigger and at what delays.
Cheers,
Doru
On 5 Feb 2012, at 19:02, Philippe Marschall wrote:
On 03.02.2012 16:11, Tudor Girba wrote:
Hi,
Do you happen to know methods to approach the problem of testing the
capacity of an application to work with an SQL database?
Which capacity?
In particular, I am interested in simulating
concurrent requests
towards this database that would resemble the interaction coming from
the application.
What are you trying to measure?
Well, you know that benchmarking is hard, don't you? I see two possible ways. First,
recording the SQL statements and then replaying them (have fun with bind parameters).
Second, just running the application itself.
Oh yeah, Oracle has a tool named RAT (Real Application Testing), no idea what the marketing department thought there.
Maybe you can play similar tricks with PITR in Postgres.
Cheers
Philippe
--
www.tudorgirba.com
"Problem solving efficiency grows with the abstractness level of problem
understanding."