|
I suspect that what you are seeing is the SQL Server process itself... you really don't want this to be shutting down (do you?). If it shuts down, then the server shuts down, and when you attempt to connect, the connection will fail.
When you CONNECT to a SQL Server DB, you just connect to a running instance of it; you don't (as with Access) spin up a process, etc.
Hope this helps, and apologies if I have been barking up the wrong tree!
"Now I guess I'll sit back and watch people misinterpret what I just said......"
Christian Graus At The Soapbox
|
|
|
|
|
Hi All,
I am reporting with Crystal Reports for .NET off of a live Access database up on a web server. The database design roughly replicates a parts inventory, and the way it was designed is almost recursive. For example, there is a parts table, and each row in the table has a part number, details, and a style number. The style number tells us whether it is a level 1 part, level 2 part, or level 3 part. This design supports infinite nesting of parts within other parts, using another table called packages with two links back to the parts table.
I am using ADO.NET DataSets to report off of. I am looking for a way to report on, for instance, all the level 1 or level 2 parts in a level 3 part, or any combination like that. Has anyone ever created a Crystal Report that recursively looked up records in a database? Thanks in advance; my head is going to explode soon, I think.
Frank
SHIKAKA
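For what it's worth, Access/Jet SQL has no recursive query support, so each nesting level has to be unrolled as one more join through the packages table. A sketch only, with guessed table and column names (Parts(PartNo, Details, StyleNo), Packages(ParentPartNo, ChildPartNo)):

```sql
-- Unrolls two levels of nesting beneath one level-3 part.
-- All names here are assumptions about your schema.
SELECT p1.PartNo, p1.StyleNo
FROM Packages AS k1
     INNER JOIN Parts AS p1 ON p1.PartNo = k1.ChildPartNo
WHERE k1.ParentPartNo = [EnterLevel3PartNo]

UNION

SELECT p2.PartNo, p2.StyleNo
FROM Packages AS k1
     INNER JOIN Packages AS k2 ON k2.ParentPartNo = k1.ChildPartNo
     INNER JOIN Parts AS p2 ON p2.PartNo = k2.ChildPartNo
WHERE k1.ParentPartNo = [EnterLevel3PartNo]
```

For truly unbounded depth you would loop in application code instead: re-query with the child part numbers returned by the previous pass until no more rows come back, accumulating the results in your DataSet before handing it to the report.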
|
|
|
|
|
I think you should use the Hierarchical Grouping option in the Report menu.
|
|
|
|
|
Hello all
I'm working on a data processing application which processes records in multiple (unrelated) tables containing about 2 million records.
Basically it runs through each record, checks various fields in each row according to certain business rules, and potentially updates other fields in the same row with updated values.
Here's my dilemma. I believe I have only 2 options available to me -
1. Using a DataSet - I haven't tried this yet, but with 2 million records it seems like a very bad idea to pull the whole table into memory while processing. Is there a way to page the DataSet so that I pull only x rows into memory at a time?
2. Using a DataReader - since I have to update each row after processing, a data reader by itself won't work. I am currently using a DataReader to iterate through the table and SqlCommands to update fields (though this seems prone to timeouts - I suspect that the writes to the database are locking out the DataReader, or vice versa).
Is there a better way to go about doing this sort of processing? Any suggestions would be greatly appreciated.
|
|
|
|
|
Do you really have to load all two million rows at a time? Can't you restrict the number of rows with your query conditions? You could also use stored procedures to do the job on the database side instead of the application side; that has better performance.
Mazy
"One who dives deep gets the pearls,the burning desire for realization brings the goal nearer." - Babuji
|
|
|
|
|
Running through 2 million records on a SQL client kind of defeats the purpose of SQL Server. I would let SQL Server handle the task by one of these two methods:
1) Use a stored procedure that implements your business rules.
OR
2) Code an Update statement that implements your business rules.
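To illustrate option 2, a single set-based UPDATE pushes the rule into the server so no rows have to cross the wire. The table and business rule below are invented purely for the sketch:

```sql
-- Hypothetical rule: flag open orders past their due date
-- and derive a surcharge from another column in the same row.
UPDATE Orders
SET    Status    = 'Overdue',
       Surcharge = Amount * 0.05
WHERE  Status  = 'Open'
  AND  DueDate < GETDATE()
```

Even fairly involved rules can often be expressed this way with CASE expressions and joins before falling back to row-by-row processing.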
|
|
|
|
|
I agree with Michael.
In this case, it is better if you use a Stored Procedure or an Update statement that has all the conditions (business rules) specified in it.
Edbert P.
Sydney, Australia.
|
|
|
|
|
Thanks for the replies folks. But there's a good reason why I need to do this on the application side. The rules are fairly complex and involve a lot of calculations so implementing them in stored procs or sql statements is out of the question.
And I do need to process every record. Here's a couple of ways I am thinking of doing it:
1. Pull in x records at a time based off a key field, process and update them, then move on to the next x records.
2. Use a DataReader to run through the table and batch the SQL update commands somehow, so that after X commands it executes them.
I've been told option #1 would likely be less efficient than #2 because of the overhead of creating and using DataSets, and because DataSets use a DataReader internally anyway, it would likely end up equivalent to option #2.
Any thoughts? Any better way to go about doing this?
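Option 1 doesn't have to mean loading the whole table: if the key field is indexed, the application can page through it in key order. A sketch with made-up names, where the application remembers the last key it processed between batches:

```sql
-- Fetch the next batch of 1000 rows after the last processed key.
-- BigTable, Id, FieldA/FieldB and @LastId are placeholders.
SELECT TOP 1000 Id, FieldA, FieldB
FROM   BigTable
WHERE  Id > @LastId
ORDER  BY Id
```

Each batch is processed and written back (ideally one transaction per batch, to keep locks short), then @LastId is advanced to the highest Id seen. This keeps memory flat and avoids holding a reader open across the updates, which may also cure the timeouts.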
|
|
|
|
|
I am using SQL Server to handle my session variables. However, every time I reboot the server, the permissions to run the SELECT queries get reset and I have to set them manually again.
To create the SQL session server I used the script InstallSqlState.sql; should I have used InstallPersistSqlState.sql?
Thanks,
|
|
|
|
|
Well, I figured it out. InstallSqlState.sql keeps its session tables in tempdb, which is rebuilt every time SQL Server restarts, so the permissions are lost with it. The answer is to write a procedure that re-grants the permissions, and have this code run by the SQL Agent:
use tempdb
go

Declare @userName char(100)
Declare @DatabaseUserID smallint

-- internal SQL user name or domain\username
Set @userName = 'InternalUser'

Select @DatabaseUserID = sysusers.uid From sysusers Where name = @userName

IF @DatabaseUserID IS NULL
BEGIN
    EXEC sp_grantdbaccess @userName
    EXEC sp_addrolemember @rolename = 'db_datareader', @membername = @userName
    EXEC sp_addrolemember @rolename = 'db_datawriter', @membername = @userName
END
|
|
|
|
|
How can I add a text file to my database?
Right now I am making an application in which I have to save all the text; each file will be at most about 3 pages of text.
Can I save it in SQL Server, and if so, how do I do it?
I will also need to concatenate strings to it...
Please help,
I am a beginner.
XANEB
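Yes, SQL Server can store this. Three pages of text may even fit in an 8000-character VARCHAR, but a TEXT column removes that limit. A sketch with invented table and column names; the UPDATETEXT part is the SQL Server 2000 way to append to a TEXT column, since ordinary string concatenation with + doesn't work on it:

```sql
-- One document per row (all names are made up).
CREATE TABLE Documents
(
    DocId    int IDENTITY PRIMARY KEY,
    FileName varchar(255),
    Body     text            -- up to 2 GB of character data
)

-- Append to the TEXT column: NULL offset means "at the end",
-- 0 means delete nothing before inserting.
DECLARE @ptr binary(16)
SELECT  @ptr = TEXTPTR(Body) FROM Documents WHERE DocId = 1
UPDATETEXT Documents.Body @ptr NULL 0 'text to append...'
```

From the application side you would read the file into a string and pass it as a parameter to an INSERT or to the UPDATETEXT statement.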
|
|
|
|
|
I can help you with concatenation of text strings. See the next page of the forum, "ntext concatenation". If you need more explanation or examples, I'll send them on request.
My best regards
|
|
|
|
|
You could attempt to parse the file line by line and add it to a VARCHAR field in the DB...
"Now I guess I'll sit back and watch people misinterpret what I just said......"
Christian Graus At The Soapbox
|
|
|
|
|
Hi All,
I have a scheduled task to back up my database every day at a specified time. If I repeatedly change my system year, my error_handler function gets called with the following parameter values:
severity = 9
dberr = 10025
oserr = 65534
dberrstr = possible network error: write to sql server failed. general network error. check your documentation
oserrstr = ConnectionRead(recv())
What might be the reason for this error?
Thanks in advance
Raghu
|
|
|
|
|
Hi,
How can I change the connection string used to connect a SQL Server Reporting Services report when the login or password of the database changes, or when switching to a different database with the same schema?
RAF
SE-Netsol Pakistan
|
|
|
|
|
If you use the .NET Framework, store it in your application's config file so you can change it without recompiling, or store it in some other file that you can edit easily. You can also make the user enter his/her username/password. Another way is to use Windows authentication for SQL Server, so no username/password is required in the connection string at all.
Mazy
"One who dives deep gets the pearls,the burning desire for realization brings the goal nearer." - Babuji
|
|
|
|
|
Hi,
Please help me...
I have 2 tables: one is 'emp', and the other, 'empc', is a copy of it.
What I want: whenever I update the emp table, the old contents should be moved to the empc table, and the new values written to the emp table.
I want all of this done through a trigger, maybe ON UPDATE... but the update will overwrite the contents of emp, so what should I do?
Thanks
|
|
|
|
|
hi
You need something like this:

create trigger empCopyTrig on emp
for update
as
    insert into empc select * from deleted
go

Try it... waiting for feedback
|
|
|
|
|
When a trigger fires for an UPDATE, the rows as they were before the update are available in the 'deleted' table, and the rows with the updated values are available in the 'inserted' table. These two tables exist only during trigger execution and are not available outside the scope of a trigger.
BK
Bhaskara
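Joining the two tables lets a trigger react only to rows that actually changed. A sketch against the emp/empc tables from the question above, assuming a key column EmpId and a Salary column (both invented here):

```sql
-- Copy the before-image of only those rows whose Salary changed.
CREATE TRIGGER empSalaryAudit ON emp
FOR UPDATE
AS
    INSERT INTO empc
    SELECT d.*
    FROM   deleted d
           INNER JOIN inserted i ON i.EmpId = d.EmpId
    WHERE  i.Salary <> d.Salary
GO
```

This works for multi-row updates too, which is why joining on the key is safer than assuming a single row.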
|
|
|
|
|
Hi,
Let's face it, I don't understand how I can do the following. Let me explain a little:
* I have an online database (in fact MySQL). I access this base through ByteFX's MySqlDataAdapter.
* I have an application which can read data from it but can also modify the data WITHOUT necessarily writing the changes back to the database. In fact, this application could be closed without having done the updates, inserts, etc.
* I do not have issues like the online data being changed underneath me or multiple people accessing the data. Only this application is responsible for the database.
What I want is that when the application is launched again, the not-yet-updated part of the data is still there. So I need to cache it in some file.
What is the best approach to do that?
Thanks
|
|
|
|
|
Hi,
If the problem is that data not yet written to the database should still appear in the application, you could cache the web page using output caching for a specified duration:
<%@ OutputCache Duration="300" VaryByParam="None" %> --> this caches the page for 5 minutes (Duration is in seconds); after that, refreshed data is displayed on the page.
I think this may work for you, instead of caching in files.
regards,
sukesh.g
|
|
|
|
|
How to retrieve the real hard drive serial number?
|
|
|
|
|
If you use .NET, see this article. You can do it with WMI:
http://www.codeproject.com/csharp/wmi.asp[^]
Mazy
"One who dives deep gets the pearls,the burning desire for realization brings the goal nearer." - Babuji
|
|
|
|
|
Hello All,
I have 2 SQL 2000 DBs on the same server. In DB#1 I have a stored procedure from which I need to get a field from a table that exists in DB#2.
Any ideas?
Thanks in advance
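Inside the procedure in DB#1 you can reference the other database's table by its three-part name (database.owner.object). All names below are placeholders:

```sql
-- Assuming the table is owned by dbo in DB#2.
SELECT @Value = SomeColumn
FROM   DB2.dbo.SomeTable
WHERE  KeyColumn = @Key
```

If the two databases ever move to different servers, the same idea works with a four-part name through a linked server (ServerName.DB2.dbo.SomeTable).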
|
|
|
|