|
I would say that, performance-wise, a database engine is going to be faster than shifting all that data over the network to the client, processing it, and sending it back again.
With the proviso that you set up the indexes correctly, I would say that SQL will be the fastest method of processing the data from a computing point of view (in terms of the human side, GUI, etc., that is something only you would know about).
Go with SQL, as that is what you are most comfortable with, and correctly created indexes can help things fly.
I would avoid cursors and use temporary tables (don't forget to add indexes to the temporary tables too), doing the processing in steps. My experience is that this is the fastest way of processing large quantities of data.
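For illustration, a minimal sketch of that staged approach (table and column names are invented for the example):
-- Stage the working set into a temp table.
SELECT OrderID, CustomerID, Amount
INTO #Working
FROM dbo.Orders                    -- hypothetical source table
WHERE OrderDate >= '20140101';
-- Index the temp table on the columns the later steps join or filter on.
CREATE INDEX IX_Working_Customer ON #Working (CustomerID);
-- Step 1: aggregate into a second temp table.
SELECT CustomerID, SUM(Amount) AS TotalAmount
INTO #Totals
FROM #Working
GROUP BY CustomerID;
-- Step 2: join the intermediate result back for the final output.
SELECT w.OrderID, w.Amount, t.TotalAmount
FROM #Working AS w
INNER JOIN #Totals AS t ON t.CustomerID = w.CustomerID;
DROP TABLE #Totals;
DROP TABLE #Working;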
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
|
|
|
|
We are well experienced with TSQL and the importance of indexing correctly; it's just that one of the senior devs suggested a C# solution, so I thought I'd get some other opinions.
I have a rule of thumb: table variables for small reference-type info, temp tables with indexing for serious volume, and cursors only under duress.
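As a minimal sketch of that rule of thumb (names invented for the example):
-- Small reference data: a table variable is fine.
DECLARE @StatusLookup TABLE (StatusCode CHAR(1) PRIMARY KEY, StatusName VARCHAR(20));
INSERT INTO @StatusLookup VALUES ('A', 'Active'), ('I', 'Inactive');
-- Serious volume: a temp table, which can carry real indexes and statistics.
CREATE TABLE #BigBatch (RowID INT NOT NULL, StatusCode CHAR(1) NOT NULL, Payload VARCHAR(100));
-- ... bulk load #BigBatch here ...
CREATE INDEX IX_BigBatch_Status ON #BigBatch (StatusCode);
SELECT b.RowID, s.StatusName
FROM #BigBatch AS b
INNER JOIN @StatusLookup AS s ON s.StatusCode = b.StatusCode;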
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Sounds like you know exactly what you need to use.
I understand checking with others because a senior dev suggested something.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
|
|
|
|
There's a third option: you don't need to choose between C# and processing in the database.
You can do both using a CLR SQL Server User-Defined Function[^].
That should appeal to both you and that senior developer.
Note that I've never done it, so I can't say how much fuss is involved. But I know that MS designed it with performance in mind.
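For reference, wiring one up looks roughly like this on the T-SQL side (the assembly, class, and method names here are hypothetical):
-- Enable CLR integration on the instance.
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;
-- Register the compiled C# assembly, then expose one of its methods as a function.
CREATE ASSEMBLY ProcessingLib FROM 'C:\libs\ProcessingLib.dll' WITH PERMISSION_SET = SAFE;
CREATE FUNCTION dbo.ProcessValue (@input NVARCHAR(4000))
RETURNS NVARCHAR(4000)
AS EXTERNAL NAME ProcessingLib.[ProcessingLib.Functions].ProcessValue;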
|
|
|
|
|
Per the other post... "process each record 36 to 120 times"
Given that, it seems likely that the processing isn't going to be simple, which suggests the TSQL is going to be rather CPU intensive.
So what is the expectation of other work that the database needs to do at the same time that this runs, now and in the future?
And what is the growth rate of the record set?
Does this occur every day?
Moving records out of and into the system is a concern, but given the processing numbers above it is something I would consider. A separate application allows the processing to be moved off-box (easier, at least in my experience).
|
|
|
|
|
I have done this on both sides, and the DB side is much, much faster. I have a big data-modeling app (35 million transactions across 20 tables) with DB procedures, and a big ETL app that has to exist and run on the Microsoft client side. The DB is much, much faster.
|
|
|
|
|
Quote:
select distinct(Emp_Status),
       count(Att_Mints / 60) as OTHOUR,
       count(Att_Totalmints / 60) as ProductionHours
from Attendence
inner join EmployeeMaster on fk_Att_EmpCode = fk_Att_EmpCode
where year(Att_Date) = '2014' and month(Att_Date) = '1'
group by Emp_Status
In the above query, the count of Att_Mints is 150 and the count of Att_Totalmints is 120.
I want the output to be OTHOUR = 1:30 min and ProductionHours = 2 hours.
Please help me.
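If the goal is to total the minutes and present them as hours:minutes, here is a hedged sketch (assuming SUM rather than COUNT is what is actually wanted, keeping the question's column names, and leaving the join out since the key columns are unclear):
select Emp_Status,
       cast(sum(Att_Mints) / 60 as varchar(10)) + ':' +
       right('0' + cast(sum(Att_Mints) % 60 as varchar(2)), 2) as OTHOUR,
       cast(sum(Att_Totalmints) / 60 as varchar(10)) + ':' +
       right('0' + cast(sum(Att_Totalmints) % 60 as varchar(2)), 2) as ProductionHours
from Attendence
where year(Att_Date) = 2014 and month(Att_Date) = 1
group by Emp_Status;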
|
|
|
|
|
|
<pre lang="vb">DECLARE @CheckQuantity INT
SET @ParmDefinition = N'@CheckQuantity INT'
SET @SQL = N'
SET @CheckQuantity= 12'
PRINT @SQL
EXEC [dbo].sp_executesql @SQL, @ParmDefinition,
@CheckQuantity=@CheckQuantity;
PRINT @CheckQuantity</pre>
It should print 12, but the output is NULL. Please tell me how to set a value for a variable declared outside the SQL text created for the dynamic query.
|
|
|
|
|
Anyone have a solution for this?
|
|
|
|
|
You need to make the parameter an OUTPUT parameter.
https://support.microsoft.com/kb/262499[^]
DECLARE @CheckQuantity INT;
DECLARE @ParmDefinition NVARCHAR(500);
DECLARE @SQL NVARCHAR(MAX);
SET @ParmDefinition = N'@CheckQuantity INT OUTPUT';
SET @SQL = N'SET @CheckQuantity = 12';
PRINT @SQL;
EXEC [dbo].sp_executesql @SQL, @ParmDefinition, @CheckQuantity = @CheckQuantity OUTPUT;
PRINT @CheckQuantity;
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Thank you, friend, it's working now.
|
|
|
|
|
Hi Team,
We have an application called TMART. Basically, TMART is used to monitor applications such as web, Citrix, etc.
Under a particular project we have multiple monitors being monitored.
We would like a SQL query through which we can pull the availability errors.
Can someone please provide us with the SQL query for the same?
It's a bit urgent...
Your response would be highly appreciated.
|
|
|
|
|
It is impossible to give you an answer without details of the tables and columns involved, so please give us the information needed to help you.
=========================================================
I'm an optoholic - my glass is always half full of vodka.
=========================================================
|
|
|
|
|
Hi Chris,
I'm really happy with the quick response.
But there are no tables or columns involved in this.
We have the data as below:
(Complete time of the period - (sum of the failure durations)) / (number of failures + 1)*
*for n failures, there are n + 1 periods of functioning, and therefore:
For example, for a month like January, with 44,640 minutes for the complete period and errors as below:
• From 1/1/14 09:17 to 1/1/14 17:20, (error 1 duration = 483 minutes)
• From 14/1/14 17:20 to 15/1/14 07:40 (error 2 duration = 860 minutes)
• From 21/1/14 07:40 to 21/1/14 11:12 (error 3 duration = 212 minutes)
The MTBF would be:
(44,640 - (483 + 860 + 212)) / 4 = 5,385 minutes 30s; MTBF is 89 hours 45mn 30s = 3 days 17 hours 45mn 30s.
This is what we require...
|
|
|
|
|
If there are no tables or columns, how can you expect a SQL query to work?
How is the data held, and where is it held?
=========================================================
I'm an optoholic - my glass is always half full of vodka.
=========================================================
|
|
|
|
|
Don't tell me it is a text report and you are hoping to query the text file using TSQL: that's not going to work.
First you need to parse the data into logical fields using the correct data types. Then you can query the TABLE and COLUMNs to get the results you need.
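Once the data is in a table, the MTBF calculation from the earlier post is straightforward. A minimal sketch, assuming a hypothetical Failures table parsed out of the report:
-- Hypothetical table holding one row per failure, parsed from the report.
CREATE TABLE dbo.Failures (
    FailureStart DATETIME NOT NULL,
    FailureEnd   DATETIME NOT NULL
);
-- MTBF in minutes for January 2014 (31 days = 44,640 minutes).
DECLARE @PeriodMinutes INT = 44640;
SELECT (@PeriodMinutes - SUM(DATEDIFF(MINUTE, FailureStart, FailureEnd)))
       / (COUNT(*) + 1.0) AS MTBF_Minutes
FROM dbo.Failures
WHERE FailureStart >= '20140101' AND FailureStart < '20140201';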
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
My query is like this:
cte3_persen (per) as
(select ((cte1.totalcount / cte2.TotaCount) * 100)
from cte1, cte2)
I am creating one new CTE, shown above.
In that, the value '2' comes from cte1.totalcount and the value '500' comes from cte2.TotaCount.
I want the percentage of those values, and it should be in 0.00% format.
I want the answer for per from the cte3_persen table to be 0.40%.
Please help, someone.
|
|
|
|
|
What are your data types for the TotalCount variables? Are they of DECIMAL type?
If so, then have a look at this:
DECLARE @value1 DECIMAL(18,2)
DECLARE @value2 DECIMAL(18,2)
SET @value1 = 2
SET @value2 = 500
SELECT (@value1 / @value2)*100
SELECT CAST((@value1 / @value2)*100 AS DECIMAL(18,2))
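If the literal '%' sign in 0.00% format is also needed, here is a hedged sketch building on the same cast (CTE names copied from the question; the source CTEs are stubbed with constants):
;with cte1 as (select cast(2 as decimal(18,2)) as totalcount),
      cte2 as (select cast(500 as decimal(18,2)) as TotaCount),
      cte3_persen as (
          select cast((cte1.totalcount / cte2.TotaCount) * 100 as decimal(18,2)) as per
          from cte1 cross join cte2
      )
select cast(per as varchar(20)) + '%' as per   -- yields 0.40%
from cte3_persen;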
Every day, thousands of innocent plants are killed by vegetarians.
Help end the violence EAT BACON
|
|
|
|
|
|
Is that the same for TotalCount from the second CTE?
Every day, thousands of innocent plants are killed by vegetarians.
Help end the violence EAT BACON
|
|
|
|
|
Sounds like you want to round your result to 2 decimal places before multiplying by 100:
SELECT
    (2/500) * 100 AS not_rounded,
    (ROUND(2/500, 2)) * 100 AS rounded
FROM dual;
returns
NOT_ROUNDED ROUNDED
----------- -------
0.4 0
Regards
|
|
|
|
|
Which is the best database to manipulate 100 million records, working with a C# WinForms application?
|
|
|
|
|
All the known databases - SQL or NoSQL - can easily handle 100 million records. All of them also have bindings to .NET (and so to C#)...
You should map your requirements onto that data: storage only, OLAP, more reads than writes...
I'm not questioning your powers of observation; I'm merely remarking upon the paradox of asking a masked man who he is. (V)
|
|
|
|
|
Actually, I am already working with SQL Server, but I am not sure that it can manage 100 million records. Somebody suggested Oracle to me; is Oracle better than SQL Server?
|
|
|
|