Thursday, February 9, 2012

basic SQL question

I'm not sure if this is the correct forum or not, but I have a basic question. Currently we are doing calculations via stored procedures and then returning the results to the client in either a web page or a WinForms-style application. There are times when the stored procedure takes anywhere from 20 minutes to 3 hours to run. Now, our database person says that all the calculations need to be removed from SQL and put in either the web application's code or the WinForms code (based on what is calling it). The db person also states that SQL is not intended to do any kind of math (calculations) and that it should all be done in the application. I think the calcs should be done on the db, and that these things are going to take 20 minutes to 3 hours to run no matter where the calcs are happening.

So, my question is: where should the math (calculations) take place? In the stored procedure or on the application side?

I am moving this to the "Transact-SQL" forum, which has a better chance of providing good feedback about the suitability of stored procedures for math calculations.

For precalculated data you might consider a different kind of storage than the relational one. Use multidimensional storage such as Analysis Services; it will help you precalculate the data. This will also speed your application up to light speed, having the results at the tip of your fingers.

HTH, Jens Suessmeyer.

http://www.sqlserver2005.de

>> Where should the math (calculations) take place? In the stored procedure or on the application side?

It depends on various requirements; there is no simple answer. In SQL Server 2005, you can potentially get a performance boost by implementing the CPU-intensive calculations in CLR code (C#/VB.NET) and running it on the database. See the link below for a whitepaper on using SQLCLR:

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnsql90/html/sqlclrguidance.asp
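For a sense of what the deployment side looks like, here is a rough T-SQL sketch of registering a SQLCLR scalar function. The assembly name, path, class, and method (`MathLib.dll`, `Calculations.Compute`) are hypothetical, invented purely for illustration:

```sql
-- CLR integration is off by default in SQL Server 2005; enable it first.
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;
GO

-- Register a hypothetical compiled assembly containing the calculation code.
CREATE ASSEMBLY MathLib
FROM 'C:\assemblies\MathLib.dll'
WITH PERMISSION_SET = SAFE;
GO

-- Expose a static method Calculations.Compute(SqlDouble) as a T-SQL function.
CREATE FUNCTION dbo.Compute (@input FLOAT)
RETURNS FLOAT
AS EXTERNAL NAME MathLib.Calculations.Compute;
GO

-- The CLR function can then be used like any T-SQL scalar function:
SELECT dbo.Compute(SomeColumn) FROM dbo.SomeTable;
```

Once registered, the calculation runs inside the database process, so the data never has to cross the wire to the application tier.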

It is also not clear whether the bottleneck is the calculations or other logic in your SPs. For example, are you using cursors or temporary tables a lot? Do you use table variables that contain large amounts of data? Are you writing procedural code that touches a large number of rows?
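To make the cursor question concrete, here is a minimal sketch (table and column names are invented for illustration) of the same calculation written both ways. The cursor version touches rows one at a time, which is often the real source of multi-hour runtimes; the set-based version lets the engine do the same work in a single pass:

```sql
-- Hypothetical table for illustration:
--   dbo.Orders (OrderID INT PRIMARY KEY, Amount FLOAT, Tax FLOAT)

-- Row-by-row with a cursor (often the slow pattern):
DECLARE @id INT, @amount FLOAT;
DECLARE c CURSOR FOR SELECT OrderID, Amount FROM dbo.Orders;
OPEN c;
FETCH NEXT FROM c INTO @id, @amount;
WHILE @@FETCH_STATUS = 0
BEGIN
    UPDATE dbo.Orders SET Tax = @amount * 0.25 WHERE OrderID = @id;
    FETCH NEXT FROM c INTO @id, @amount;
END
CLOSE c;
DEALLOCATE c;

-- Equivalent set-based statement (one pass, usually far faster):
UPDATE dbo.Orders SET Tax = Amount * 0.25;
```

If your SPs look more like the first form than the second, the fix is usually rewriting the T-SQL, not moving the math to the application.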


My guess is that it's more a question of how the data is retrieved and calculated upon than of the fact that calculations are done in T-SQL code.

As a comparison, I'm somewhat involved with an old legacy system that does heavy financial calculations in T-SQL, where roughly a million rows of raw data are read and the same number of rows returned, but just about every column is calculated upon in some way. We're talking derived tables, CASE expressions nested several levels deep, almost every math function available, additions, subtractions, divisions, multiplications; literally pages and pages of T-SQL code for a single query in QA.

Nevertheless, these 'math queries' complete in a few minutes. There is no looping or cursoring involved; it's all set-based, albeit with a zillion math expressions, and it still performs 'good enough' for our purposes.
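A toy version of that kind of set-based 'math query' might look like the following; the schema and the formulas are invented for illustration, but the shape (a derived table feeding nested CASE expressions and math functions in the outer SELECT) is the point:

```sql
-- Hypothetical schema:
--   dbo.Positions (Qty FLOAT, Price FLOAT, Rate FLOAT, Class CHAR(1))
SELECT
    d.MarketValue,
    -- Nested CASE expressions, evaluated per row but in a single set-based pass:
    d.MarketValue * CASE d.Class
                        WHEN 'A' THEN CASE WHEN d.Rate > 0.05 THEN 0.08 ELSE 0.04 END
                        WHEN 'B' THEN 0.02
                        ELSE 0.01
                    END                        AS RiskCharge,
    -- Arbitrary math functions applied to the derived column:
    LOG(ABS(d.MarketValue) + 1) * SQRT(d.Rate) AS DerivedMeasure
FROM (SELECT Qty * Price AS MarketValue, Rate, Class
      FROM dbo.Positions) AS d;  -- derived table feeding the outer calculations
```

Multiply this by many more columns and several more levels of nesting and you get the 'pages and pages' of T-SQL described above, yet it still runs as one set-based statement.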

Of course there may be other factors, such as hardware, that can make or break something like this, but this runs on fairly 'standard' equipment of today: a 4-way server, 8 GB of RAM, a SAN cabinet for the disks... Not *that* fancy at all.

/Kenneth
