|
Hi, I had the same problem; the solution I used is below.
You can also try IBM Client Access (we have Version 5 Release 2):
"IBM DB2 UDB for iSeries OLE DB Provider"
With this I managed to set up the connection through the Visual Studio IDE (Server Explorer) and view the data.
The connection string I got from a UDL file:
Provider=IBMDA400;Password="";Persist Security Info=True;User ID=ME;Data Source=xxx.xxx.xxx.xxx;Protection Level=None;Initial Catalog=AVISPAR2;Transport Product=Client Access;SSL=DEFAULT;Force Translate=65535;Default Collection="";Convert Date Time To Char=TRUE;Catalog Library List="";Cursor Sensitivity=3
Good luck!!
|
|
|
|
|
Something like SQL Server's Profiler, thanks!
|
|
|
|
|
No such luck!! Access (since it's essentially a single-user desktop DB) generally has no need for such niceties.
You may, however, be able to turn on ODBC tracing: go to the ODBC Data Sources dialog, specify a log file and click Start Tracing. Once you have the info you need, MAKE SURE YOU STOP THE TRACE - this is very important - tracing will cripple the performance of your machine, and it persists across reboots....
"Now I guess I'll sit back and watch people misinterpret what I just said......"
Christian Graus At The Soapbox
|
|
|
|
|
Thanks! Yes, ODBC tracing works for an ODBC-based app, but how can I monitor an app based on ADO or OLE DB?
|
|
|
|
|
hi,
I am currently working on merge replication. I need to configure my computer as publisher and distributor. When configuring publishing and distribution using the wizard I get this message:
Error 18483: Could not connect to server "se13" because "distributor_admin" is not defined as a remote login at the server.
I tried reinstalling SQL Server Service Pack 3 and that message disappeared, but a new one came up:
Error 14114: '(null)' is not configured as a distributor
"se13" is my computer name.
thanks. 
|
|
|
|
|
Hello all,
When I close a web form that has a connection to my SQL Server, I am not seeing the memory process close in Task Manager (on the SQL Server). I am using the "open late, close early" approach to database connections, and I call the "Close" method on my connections. Is there any automated utility that will shut down these processes? I thought that when the user disconnected from the database, the memory process would automatically shut down.
Any suggestions, thoughts, or ideas?
TYIA,
lonelobo
|
|
|
|
|
I suspect that what you are seeing is the SQL Server process itself... you really don't want this to shut down (do you?). If it shuts down, then the server shuts down, and when you next attempt to connect, the connection will fail.
When you CONNECT to a SQL Server DB, you just connect to a running instance of it; you don't (a la Access) spin up a new process.
Hope this helps, and apologies if I have been barking up the wrong tree!
"Now I guess I'll sit back and watch people misinterpret what I just said......"
Christian Graus At The Soapbox
|
|
|
|
|
Hi All, I am reporting with Crystal Reports for .NET off of a live Access database up on a web server. The database design sort of replicates a parts inventory, and the way it was designed almost replicates recursion. For example, there is a parts table, and each row in the table has a part number, details, and a style number. The style number tells us whether it is a level 1 part, level 2 part, or level 3 part. This design supports infinite nesting of parts within other parts using another table, called packages, with two links to the parts table. I am using ADO.NET datasets to report off of. I am looking for a way to report on, for instance, all the level 1 or level 2 parts in a level 3 part, or any combination like that. Has anyone ever created a Crystal Report that recursively looked up records in a database? Thanks in advance; my head is going to explode soon, I think.
Frank
SHIKAKA
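The nested-parts lookup described above can be sketched outside Crystal with a plain recursive walk. The table and column names below are guesses at the schema described in the post, and SQLite stands in for the Access database:

```python
import sqlite3

# Hypothetical schema loosely following the post: a "parts" table and a
# "packages" link table that nests parts within other parts.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE parts (part_no INTEGER PRIMARY KEY, details TEXT, style INTEGER);
    CREATE TABLE packages (parent_part INTEGER, child_part INTEGER);
    INSERT INTO parts VALUES (1, 'assembly', 3), (2, 'sub-assembly', 2),
                             (3, 'widget', 1), (4, 'bolt', 1);
    INSERT INTO packages VALUES (1, 2), (2, 3), (2, 4);
""")

def parts_under(conn, part_no):
    """Recursively collect every part nested (at any depth) under part_no."""
    found = []
    for (child,) in conn.execute(
            "SELECT child_part FROM packages WHERE parent_part = ?", (part_no,)):
        found.append(child)
        found.extend(parts_under(conn, child))
    return found

print(parts_under(conn, 1))  # every part nested under part 1: [2, 3, 4]
```

The same idea could feed a pre-flattened dataset to the report, so Crystal itself never has to recurse.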
|
|
|
|
|
I think you should use the Hierarchical Grouping option in the Report menu.
|
|
|
|
|
Hello all
I'm working on a data processing application which processes records in multiple (unrelated) tables containing about 2 million records.
Basically it runs through each record, checks various fields in each row according to certain business rules, and potentially updates other fields in the same row with updated values.
Here's my dilemma. I believe I have only 2 options available to me -
1. Using DataSet - I haven't tried this yet, but with 2 million records it seems like a very bad idea to pull a DataSet of the whole table into memory while processing. Is there a way to page the DataSet so that I can pull only x rows into memory at a time?
2. Using DataReader - since I have to update each row after processing, a DataReader by itself won't work. I am currently using a DataReader to iterate through the table and SqlCommands to update fields (though this seems prone to timeouts - I suspect that the writes to the database are locking out the DataReader, or vice versa).
Is there a better way to go about doing this sort of processing? Any suggestions would be greatly appreciated.
|
|
|
|
|
Do you really have to load all two million rows at a time? Can't you restrict the number with your query conditions? You can also use stored procedures to do the job on the database side instead of the application side; that gives better performance.
Mazy
"One who dives deep gets the pearls,the burning desire for realization brings the goal nearer." - Babuji
|
|
|
|
|
Running through 2 million records on a SQL client kind of defeats the purpose of a SQL Server. I would let the SQL Server handle the task by one of these two methods:
1) Use a stored procedure that implements your business rules.
OR
2) Code an Update statement that implements your business rules.
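Michael's second option might look something like the sketch below. The table and the business rule are made up purely to show the shape of a single set-based UPDATE, with SQLite standing in for SQL Server:

```python
import sqlite3

# Sketch of the set-based approach: one UPDATE that encodes a (made-up)
# business rule, instead of iterating over rows on the client.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER, status TEXT)")
conn.executemany("INSERT INTO orders (qty, status) VALUES (?, ?)",
                 [(5, 'new'), (150, 'new'), (80, 'new')])

# Hypothetical rule: flag any order over 100 units for review.
updated = conn.execute(
    "UPDATE orders SET status = 'review' WHERE qty > 100 AND status = 'new'"
).rowcount
print(updated)  # number of rows the rule touched: 1
```

The server scans and writes in one statement, so no rows ever cross the wire to the application.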
|
|
|
|
|
I agree with Michael.
In this case, it is better if you use a Stored Procedure or an Update statement that has all the conditions (business rules) specified in it.
Edbert P.
Sydney, Australia.
|
|
|
|
|
Thanks for the replies, folks. But there's a good reason why I need to do this on the application side: the rules are fairly complex and involve a lot of calculations, so implementing them in stored procs or SQL statements is out of the question.
And I do need to process every record. Here are a couple of ways I am thinking of doing it:
1. Pull in x records at a time based on a key field, process and update them, then move on to the next x records.
2. Use a DataReader to run through the table, and batch the SQL UPDATE commands somehow so that after every X commands they get executed.
I've been told option #1 would likely be less efficient than #2 because of the overhead of creating and using DataSets, and because DataSets use a DataReader internally anyway, which would make it roughly equivalent to option #2.
Any thoughts? Any better way to go about doing this?
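Option #1 (keyset-paged batches keyed on the primary key) can be sketched like this. The table and column names are invented, and SQLite stands in for SQL Server:

```python
import sqlite3

# Sketch: process the table in batches of x rows, keyed on the primary key,
# so only one batch is ever in memory at a time.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, value INTEGER)")
conn.executemany("INSERT INTO records (id, value) VALUES (?, ?)",
                 [(i, i * 10) for i in range(1, 26)])

BATCH = 10
last_id = 0
batches = 0
while True:
    rows = conn.execute(
        "SELECT id, value FROM records WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, BATCH)).fetchall()
    if not rows:
        break
    # apply the business rule to the batch, then write the results back
    conn.executemany("UPDATE records SET value = ? WHERE id = ?",
                     [(value + 1, rid) for rid, value in rows])
    last_id = rows[-1][0]   # resume after the last key we saw
    batches += 1
conn.commit()
print(batches)  # 25 rows / 10 per batch -> 3 batches
```

Seeking by "id > last seen key" avoids the OFFSET-style rescans a paged DataSet would cost, and keeps each read/write cycle short so locks are released between batches.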
|
|
|
|
|
I am using SQL Server to handle my session variables. However, every time I reboot the server the permissions to run the SELECT queries get reset, and I have to set them manually again.
To create the SQL Session server I used the script InstallSqlState.sql, should I have used InstallPersistSqlState.sql?
Thanks,
|
|
|
|
|
Well, I figured it out: the answer is to write a procedure that sets the permissions. Just have this code run by the SQL Server Agent:
USE tempdb

DECLARE @userName char(100)
DECLARE @DatabaseUserID smallint

-- internal SQL user name or domain\username
SET @userName = 'InternalUser'

SELECT @DatabaseUserID = [sysusers].[uid] FROM sysusers WHERE name = @userName

IF @DatabaseUserID IS NULL
BEGIN
    EXEC sp_grantdbaccess @userName
    EXEC sp_addrolemember @rolename = 'db_datareader', @membername = @userName
    EXEC sp_addrolemember @rolename = 'db_datawriter', @membername = @userName
END
|
|
|
|
|
How can I add a text file to my database?
Right now I am making an application in which I have to save all the text; each file will be at most about 3 pages of text.
Can I save it in SQL Server, and if so, how do I do it?
I will also need to concatenate strings to it...
Please help,
I am a beginner.
XANEB
|
|
|
|
|
I can help you with concatenation of text strings. See the next page of the forum, "ntext concatenation". If you need more explanations or examples, I'll send them on demand.
My best regards
|
|
|
|
|
You could attempt to parse the file line by line, and add it to a VARCHAR field in the DB...
"Now I guess I'll sit back and watch people misinterpret what I just said......"
Christian Graus At The Soapbox
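A rough sketch of that idea, reading the whole (small) file rather than line by line, with SQLite standing in for SQL Server. Note that SQLite concatenates strings with `||`; SQL Server would use `+` for varchar (ntext needs the tricks from the "ntext concatenation" thread):

```python
import sqlite3, tempfile, os

# Sketch: read a small text file into a TEXT column, then append to it
# with a string concatenation done in SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (name TEXT PRIMARY KEY, body TEXT)")

# a stand-in for the user's ~3-page text file
path = os.path.join(tempfile.mkdtemp(), "notes.txt")
with open(path, "w") as f:
    f.write("first line\nsecond line\n")

with open(path) as f:
    conn.execute("INSERT INTO docs VALUES (?, ?)", ("notes.txt", f.read()))

# append to the stored text (|| is SQLite's string concatenation)
conn.execute("UPDATE docs SET body = body || ? WHERE name = ?",
             ("third line\n", "notes.txt"))
body = conn.execute("SELECT body FROM docs WHERE name = 'notes.txt'").fetchone()[0]
print(body)
```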
|
|
|
|
|
Hi All,
I have a scheduled task to back up my database every day at a specified time. If I keep changing my system year, the error_handler function gets called with the following parameter values:
severity = 9
dberr = 10025
oserr = 65534
dberrstr = possible network error: write to sql server failed. general network error. check your documentation
oserrstr = ConnectionRead(recv())
What might be the reason for this error?
Thanks in advance
Raghu
|
|
|
|
|
hi
How can I change the connection string used by a SQL Server Reporting Services report when the database login or password changes, or when I switch to a different database with the same schema?
RAF
SE-Netsol Pakistan
|
|
|
|
|
If you use the .NET Framework, store it in your application's config file so you can change it without recompiling, or store it in some other file you can edit easily. You can also force the user to enter his/her username/password. Another way is to use Windows authentication for your SQL Server, so no username/password is required in the connection string at all.
Mazy
"One who dives deep gets the pearls,the burning desire for realization brings the goal nearer." - Babuji
|
|
|
|
|
Hi,
Please help me...
I have 2 tables: one is 'emp', and the other, 'empc', is a copy of it.
What I want: whenever I update the emp table, the old contents should be moved to the empc table, and the new values written to the emp table.
I want all of this done through a trigger, maybe ON UPDATE... but the update will overwrite the contents of emp, so what should I do?
Thanks
|
|
|
|
|
hi
you need something like this:
CREATE TRIGGER empCopyTrig ON emp
FOR UPDATE
AS
    INSERT INTO empc SELECT * FROM deleted
GO
Try it... waiting for feedback.
|
|
|
|
|
When a trigger fires for an update, the affected rows with their old values are available in the 'deleted' table, and the rows with their updated values are available in the 'inserted' table. These two tables exist only during trigger execution and cannot be referenced outside the scope of a trigger.
BK
Bhaskara
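The same audit pattern can be tried out in SQLite, which exposes the old row per-row as OLD.&lt;column&gt; instead of a 'deleted' pseudo-table; this is only an illustration of the pattern, not SQL Server syntax:

```python
import sqlite3

# emp holds the live rows; empc receives a copy of each row's old values
# whenever emp is updated, captured by the trigger.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE emp  (id INTEGER PRIMARY KEY, name TEXT, salary INTEGER);
    CREATE TABLE empc (id INTEGER, name TEXT, salary INTEGER);
    CREATE TRIGGER emp_copy AFTER UPDATE ON emp
    BEGIN
        INSERT INTO empc VALUES (OLD.id, OLD.name, OLD.salary);
    END;
    INSERT INTO emp VALUES (1, 'Alice', 1000);
""")
conn.execute("UPDATE emp SET salary = 1200 WHERE id = 1")
old_row = conn.execute("SELECT * FROM empc").fetchone()
print(old_row)  # the pre-update values, captured by the trigger: (1, 'Alice', 1000)
```

The update itself still overwrites emp, which is exactly what the poster wants; the trigger preserves the previous values as a side effect.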
|
|
|
|