|
to get an answer, you should first ask in the right forum...
then, be sure that everyone can understand what you mean (for example, by not assuming people know what IJVM is)...
one more thing: be pleasant on the forum, otherwise no one will help you. cross-posting to the forums under several accounts (referring to this[^]) will never serve you well...
TOXCCT >>> GEII power
[VisualCalc 3.0 updated ][Flags Beginner's Guide new! ]
|
|
|
|
|
I am using .NET 1.1 / SQL Server 2000 and recently moved from the old Data Access Application Block to the Enterprise Library Data Access Application Block (June 2005).
In all our applications we store the connection string in the web.config file using custom encryption (a VB DLL).
ex: <add key="ConnectionString" value="2D2F447DA70A2CF1E1A61F2475B3D7F67592A67ACF108FFE9E08CFCE3C6463B5C175B6E6A5"/>
We used to decrypt it and pass the connection string to SqlHelper. The new DAAB provides only:
Dim db As Database = DatabaseFactory.CreateDatabase()
without an option to pass a connection string. I can use dataConfiguration.config to store the connection string, but how do I encrypt and decrypt it? I can't use clear text for connection strings, and no other encryption methodology can be used either.
Please help; I am unable to find a solution. Code snippets would be very much appreciated.
|
|
|
|
|
Hi all,
We are writing a Windows service which tries to access a database server on a different network.
While connecting to one of the databases it gives an error saying 'EXECUTE permission denied on object 'sp_sdidebug', database 'master', owner 'dbo''.
But if we try to connect to the same database with any web or Windows application, it does not raise any error and we are able to connect to that DB.
Thanks for your help
Praveen
|
|
|
|
|
Praveen_S wrote: But if we try to connect to the same database with any web or Windows application, it does not raise any error and we are able to connect to that DB.
The error message you receive shows that you are actually connected to the database - otherwise it could not tell you about the individual access permissions on database objects.
The error occurs because the user you are connecting as does not have permission to EXECUTE that stored procedure. Check which account the service uses to connect to the database and verify that it has sufficient privileges to run the stored procedure in question.
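For example, a quick way to inspect and grant the permission in Query Analyzer might look like the sketch below ('MYDOMAIN\ServiceAccount' is just a placeholder for whichever account your service actually connects as):
-- Run in the database that owns the procedure (master, in the case of sp_sdidebug)
USE master
GO
-- List the permissions currently granted or denied on the procedure
EXEC sp_helprotect 'sp_sdidebug'
GO
-- Grant EXECUTE to the service's account (placeholder name)
GRANT EXECUTE ON dbo.sp_sdidebug TO [MYDOMAIN\ServiceAccount]
GO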
"On two occasions, I have been asked [by members of Parliament], 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able to rightly apprehend the kind of confusion of ideas that could provoke such a question."
--Charles Babbage (1791-1871)
My: Website | Blog
|
|
|
|
|
Uncheck 'Enable SQL Debugging' in Project Properties. This probably doesn't happen for anyone else because that option is stored in the .user file (e.g. myproj.csproj.user), which is generally not placed under source control.
That's assuming you're getting the error when trying to debug. If you're getting it at some other time, I don't know what's wrong.
Stability. What an interesting concept. -- Chris Maunder
|
|
|
|
|
This is on SQL Server 2000. I don't think it's possible, but maybe there is some way of doing it, or another, better idea?
If I have two tables and I want to insert a new record into both of them, and at the same time set a field in one table to the identity value inserted in the other, can that be done? The SQL I'd want to work would be, for example:
insert into tablea inner join tableb on tablea.bid = tableb.id( a, b, c, bid ) values( 1, 2, 3, tableb.id)
|
|
|
|
|
|
OK, thanks, that's a good idea. But the question still remains about the SQL to use in the stored procedure.
|
|
|
|
|
You can retrieve the value of the identity column from the last insert performed using the @@IDENTITY variable. Due to the action of triggers, however, this can be the wrong value if one or more triggers that fired also performed an insert. On SQL Server 2000 and later the SCOPE_IDENTITY() function is a better choice.
So something like:
INSERT INTO tableb( c )
VALUES( 3 )
INSERT INTO tablea( a, b, bid )
VALUES( 1, 2, SCOPE_IDENTITY() )
Stability. What an interesting concept. -- Chris Maunder
|
|
|
|
|
That's great, thanks Mike!
|
|
|
|
|
Hello all, I recently switched a large ASP.NET (VB) application over from SqlClient to OdbcClient. While this change was largely uneventful, it caused one disturbing piece of fallout: any place where I was populating a DataTable with the results of a stored procedure now fails. The code actually executes the SP (I can see that on the database), but when the results are brought back to the server the DataTable has Rows.Count = 0.
I have found a fix, but I don't want to update the dozens of places in my code where this occurs. The fix, strangely enough, is to fill a DataSet instead of a DataTable, then use the table inside the DataSet. Strange.
My code that worked with SqlClient but not with OdbcClient:
Dim sSQL As String = "exec my_proc " & sArg
Dim oConn As New OdbcConnection(sConString)
Dim oCommand As New OdbcDataAdapter(sSQL, oConn)
Dim dt As New DataTable
oConn.Open()
Try
    oCommand.Fill(dt)
Catch ex As Exception
    PrintError(ex.Message, ex.StackTrace)
End Try
oConn.Close()
Response.Write(dt.Rows.Count) ' ==> yields "0"
This works:
Dim sSQL As String = "exec my_proc " & sArg
Dim oConn As New OdbcConnection(sConString)
Dim oCommand As New OdbcDataAdapter(sSQL, oConn)
Dim ds As New DataSet
oConn.Open()
Try
    oCommand.Fill(ds)
Catch ex As Exception
    PrintError(ex.Message, ex.StackTrace)
End Try
oConn.Close()
Response.Write(ds.Tables(0).Rows.Count) ' ==> yields "49"
Being thorough, I realize this isn't the best way to call an SP. As time permits, I've been converting code to use an adapter with CommandType.StoredProcedure and setting up Parameters. It should be noted that this also does not work with OdbcConnection and OdbcDataAdapter, but works fine with SqlConnection and SqlDataAdapter.
Crazy... any help would be greatly appreciated.
|
|
|
|
|
I want to speed up my stored procedures and am currently looking for ways to increase the performance of my application. The stored procedures fetch a lot of data after joining a lot of tables, so I had to use temporary tables and table variables. My plan now is to stop some of my lengthy stored procedures from making transaction log entries (so that the time spent on log I/O is saved), but I can't find the proper command to execute from within my stored procedure that interacts with the SQL Server environment and does this for me.
Does anyone have a solution to this?
Rohit
|
|
|
|
|
Transaction log entries are made whenever you make a change to your tables: perform an UPDATE, DELETE or INSERT operation. The log entries are required to allow you to roll back your transaction. They are required to allow an administrator to roll forward operations from the last full backup, if, for example, a disk failure occurs. They also permit the database to be rolled back to a specific point in time, if, say, a coding error causes data to be deleted. Finally they're used during database startup recovery, if the database was not cleanly shut down (e.g. system power loss, SQL Server crash).
The transaction log contains the before and after image of each row affected: in the case of an INSERT there is obviously no 'before' image and for a DELETE no 'after' image. To constrain the amount of transaction log required and therefore the amount of transaction log I/O, you need to reduce the number of rows you affect, if possible, and ensure that you update each row as few times as possible.
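As a trivial illustration of that last point (table and column names here are made up), two passes over the same rows log each row change twice, while one combined UPDATE logs it once:
-- Two statements: each affected row generates two log records
-- UPDATE Orders SET Status = 'P' WHERE OrderID = @OrderID
-- UPDATE Orders SET ProcessedOn = GETDATE() WHERE OrderID = @OrderID
-- One statement: the same net change, logged once per row
UPDATE Orders
SET Status = 'P', ProcessedOn = GETDATE()
WHERE OrderID = @OrderID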
If you've already done this, and you're sure that transaction log I/O is your bottleneck, you need to take steps to ensure that the physical transaction log I/O is as fast as possible. Firstly, ensure it's on a separate physical disk or array of disks to anything else - the transaction log requires almost entirely sequential write with very occasional reads required when a transaction is rolled back, whereas the database data files are fully random reads and writes. Mixing random and sequential I/O sends the disk head out of position to perform the next sequential I/O, which increases the time taken to write.
Because the transaction log is so vital to the integrity of the database, you should ensure that any RAID array you use has some form of redundancy to allow the array to be reconstructed after a disk failure: one of RAID 1, RAID 5 or RAID 10. Of these, RAID 5 is generally considered weakest - RAID 5's gains are in random read performance because random I/Os are spread across disks, while each write must first read the existing data and parity blocks to recompute the parity, then write both back. You can get better write performance from RAID 1 (mirroring, where all writes go to two disks), and RAID 5's performance suffers badly when rebuilding an array. Likewise, although RAID 10 allows improved random-access read and write performance, it does little for sequential reads or writes.
You may be able to improve performance by using a battery-backed write-back caching disk controller. If the controller guarantees to write back operations in the same order that they were received by the controller, and you are sure that mains power will be restored to the server before the controller's battery runs out, it's safe. If the controller, or you, can't meet these guarantees, you could suffer database corruption. Search Microsoft's knowledge base for "cache controller sql" for more details.
Stability. What an interesting concept. -- Chris Maunder
|
|
|
|
|
I am writing a query where I need to find two dates which are exactly one month and two months earlier than a given date.
For example, if the date is '2006/05/01',
then the result dates should be '2006/04/01' and '2006/03/01' respectively.
Please help.
|
|
|
|
|
Use DATEADD[^]
For example:
SELECT DATEADD(MONTH, -1, TheDate), DATEADD(MONTH, -2, TheDate)
FROM SomeTable
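With the literal date from the question (written in the unambiguous yyyymmdd form so the string converts the same way regardless of the DATEFORMAT setting), the expected output would be:
SELECT DATEADD(MONTH, -1, '20060501'),  -- 2006-04-01
       DATEADD(MONTH, -2, '20060501')   -- 2006-03-01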
Does this help?
"On two occasions, I have been asked [by members of Parliament], 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able to rightly apprehend the kind of confusion of ideas that could provoke such a question."
--Charles Babbage (1791-1871)
My: Website | Blog
|
|
|
|
|
|
Hi,
I created my installation project using VS.NET 2005 and then added the merge modules to the project, but it doesn't work!
I also tried to customize the MSDE internal properties, but that didn't work either.
Any suggestions?
hatem++;
|
|
|
|
|
From what I recall, using the merge modules is now not recommended. Microsoft continue to ship the merge modules for applications already using them. See this article[^], specifically the comment at the end:
"PSS Recommendations Regarding MSDE Merge Modules
"Due to rapidly changing threats to the security of our IT systems, we strongly suggest that customers use the MSIs provided by Microsoft and not use merge modules. Using the MSIs provided by Microsoft allow the customer greater flexibility in responding to these threats and making sure that their MSDE embedded applications are secure."
You should consider launching the MSDE setup from your own setup launcher program. You may be able to do this with a custom action. See also Microsoft's MSDE 2000 Deployment Toolkit[^].
IIRC the recommended practice for using SQL Server 2005 Express Edition is to allow setup to create a single SQLEXPRESS instance if one is not already present. All applications should then share that instance, specifying the database file to use in the connection string. Microsoft have belatedly realised that this isn't terribly convenient for developers and are now discussing porting SQL Server 2005 Mobile Edition to Windows XP and Windows Server 2003, from Windows CE - it will be called 'SQL Server Everywhere Edition'. This will be implemented as a set of DLLs and will run in your application's process, rather than as a separate service. I suspect this is the long-term replacement for the Jet database engine used by Access.
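For example, a connection string for an application sharing that SQLEXPRESS instance might look something like the following (the database name and file path are placeholders, not anything prescribed by Microsoft):
Data Source=.\SQLEXPRESS;Initial Catalog=MyAppDb;Integrated Security=True
or, if the application attaches its own database file to the shared instance:
Data Source=.\SQLEXPRESS;AttachDbFilename=C:\Data\MyApp.mdf;Integrated Security=True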
Stability. What an interesting concept. -- Chris Maunder
|
|
|
|
|
I would like to know how to rename a table through a SQL script.
I tried to write a stored procedure with the following commands, where @NewName represents the new table name and @PreviousName represents the previous name:
UPDATE [TDM-DataSets].INFORMATION_SCHEMA.TABLES
SET TABLE_NAME = @NewName
WHERE TABLE_NAME = @PreviousName;
The error message I got is:
Ad hoc updates to system catalogs are not allowed.
Does anyone have an idea ?
Thanks,
Clint
|
|
|
|
|
"Rename Table @PreviousName TO @NewName"
We need to graduate from the ridiculous notion that greed is some kind of elixir for capitalism - it's the downfall of capitalism. Self-interest, maybe, but self-interest run amok does not serve anyone. The core value of conscious capitalism is enlightened self-interest.
Patricia Aburdene
Bulls make money, bears make money, pigs get slaughtered.
Jim Cramer
|
|
|
|
|
I put the line in the stored procedure and it doesn't work.
I am working with SQL Server, and it doesn't recognize the RENAME command.
Thanks,
Clint
-- modified at 11:00 Sunday 30th April, 2006
|
|
|
|
|
Thanks,
I found out :
EXEC dbo.sp_rename 'OldTableName', 'NewTableName'
|
|
|
|
|
Hi,
I have one ObjectDataSource whose SELECT query requires a parameter @CompanyId. CompanyId is a Guid and I want to compare it against a bunch of GUIDs. Can I pass those GUIDs programmatically?
My query is like:
select a.companyId, b.startdate, b.....
from a,b
where a.companyId IN (@CompanyID)
|
|
|
|
|
hi there
Yes, you can use this query, but remember to pass @CompanyID as a comma-separated string of IDs.
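Note that a parameter used directly inside IN (...) is normally treated as a single value, so on SQL Server 2000 the comma-separated list usually has to be spliced into dynamic SQL. A rough sketch only, inside a procedure that takes @CompanyID as varchar(8000) holding the quoted, comma-separated GUIDs (table and column names are taken from the question; concatenating strings into SQL like this requires the input to be trusted):
-- @CompanyID contains e.g. '11111111-1111-1111-1111-111111111111','22222222-2222-2222-2222-222222222222'
-- (each value quoted, values separated by commas)
DECLARE @sql nvarchar(4000)
SET @sql = N'SELECT a.companyId, b.startdate FROM a, b '
         + N'WHERE a.companyId IN (' + @CompanyID + N')'
EXEC sp_executesql @sql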
|
|
|
|
|
Hi,
Thanks for the answer. I found an alternative way to do it, as someone suggested on the ASP.NET forum.
But your reply made me think about the simple mistake I was making: I had put commas between the values but had not put quotes around each value, so I suppose each GUID was being compared to the entire string. I'll try the 'guid','guid' format as well.
Thanks
|
|
|
|
|