I have a stored procedure and one of the fields in the SELECT is a datetime type. On SELECT I am getting the date and the time but no AM/PM returned. The datetime in the query is returned in this format:
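For anyone after the AM/PM designator, converting the datetime to a character style that carries it is one route; CONVERT styles 100 and 109 both include AM/PM. A minimal sketch (table and column names invented):

SELECT CONVERT(varchar(30), MyDateCol, 100) AS WithAmPm  -- mon dd yyyy hh:miAM
FROM dbo.MyTable;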
Hi, I have been writing a lot of quick and dirty T-SQL scripts to correct database issues, and I've been saving the output of the scripts for future records. I have been executing them through Query Analyzer. I find the messages Query Analyzer puts out obscure my output, e.g. "6 rows processed". Is there any way to shut this off? Steve
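If the noise is the row-count messages, those can usually be silenced at the top of the script:

SET NOCOUNT ON;   -- suppresses the "n rows affected" chatter
-- ...corrective statements here...
SET NOCOUNT OFF;  -- optional: restore the default at the end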
We're using SQL Server 7 with an Access 2000 MDB as a front end with ODBC linked tables. I recently created a new set of tables for the app, and users are complaining that unsaved data is being lost when they move to a new record. This seems to be the case when there are multiple users; when there is a single user using it, we don't seem to have that problem. It seems that we had this problem when we first converted from an MDB back end to a SQL 7 back end, years ago, but we haven't had this problem in a while. These are the first "entirely new" tables created in several years, and we seem to be having that problem again. Is this something with SQL 7 when it's dealing with new tables? Any ideas on what to do? Thanks! Neil
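One thing worth checking with Access front ends against SQL Server: linked tables without a timestamp column force Access to compare every field to detect concurrent edits, which is a classic source of silently lost updates once multiple users are involved. Adding one per table is cheap (table name invented), after which the tables need to be relinked in Access:

ALTER TABLE dbo.Orders ADD upd_ts timestamp;  -- lets Access detect row changes reliably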
I'm a newbie to both Windows Forms development and SQL Server CE, so please bear with me...
I am developing a vocabulary builder application, and my data keeps reverting to the data set I originally imported. I keep losing the records I've added using the application I'm developing, or from within VS2005. I know about the 'Copy To Output Directory' option in the Solution properties and I set this to 'Do not copy'.
What gives? Not sure what other information you might need, but I suspect it's something really basic... Thanks!
When I copy tables from one database to another (using the DTS Wizard) I lose my settings: primary keys and default values!! Any help would be appreciated. Thanks
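A hedged pointer: the plain "Copy table(s)" path in the DTS Wizard creates bare tables, whereas the "Copy objects and data between SQL Server databases" option carries keys and defaults across. Failing that, the lost pieces can be re-added by hand (all names invented):

ALTER TABLE dbo.MyTable ADD CONSTRAINT PK_MyTable PRIMARY KEY (ID);
ALTER TABLE dbo.MyTable ADD CONSTRAINT DF_MyTable_Created DEFAULT (GETDATE()) FOR Created;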
This isn't really DB related, except for the connection, but I was wondering if any of you guys had run into this before.
I have an ODBC System DSN set up for a connection to my MSSQL2K server. My users will occasionally lose the Default Database setting (the drop-down during DSN setup), forcing me to go in and reset it. It's getting to be a pain in the ass.
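If the DSN keeps forgetting, setting the default database on the server-side login makes the drop-down irrelevant. A sketch (login and database names invented):

EXEC sp_defaultdb @loginame = 'MyUser', @defdb = 'MyDatabase';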
Following on from some problems I have been having with flat files, I seem to be stumbling from one issue to another.
I have coded my text file with some very contrived delimiters to ensure that my real data isn't tripping up the package. I am stripping all line break chars, tabs etc.
Now I seem to get all rows imported, but numerous integer columns within the file are being set to zero.
When I toggle the Keep Null options in the data flow task it will either import the row, but with the zeros, or not import the row at all. I have opened the file in text editors, and Excel and it all seems fine - in fact the same source file works OK with DTS/2000.
At a guess it seems as though these columns are seen as null by the package, so with the option switched on it defaults them to zero - but they are most certainly not null in the file!! The table is a staging table and so is very basic in structure (no defaults or constraints).
SSIS seems a bit buggy, or at the very least oversensitive, to me - I am at the point of abandoning it!!
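A workaround that sidesteps the null-vs-zero guessing is to land the suspect columns as strings and convert them once they're in the staging table, so genuine blanks can be kept as NULL. A sketch (table and column names invented):

UPDATE dbo.Staging
SET IntCol = CASE WHEN LTRIM(RTRIM(IntColRaw)) = '' THEN NULL
                  ELSE CAST(IntColRaw AS int) END;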
Hi, I am having a problem when I try to execute an SSIS package from a job in SQL Server 2005. First of all, when I create a new connection in the designer, I fill in server, user and password, and the connection tests successfully. Then I execute the whole package and it works fine. After building it, I tried to execute the package using the file that had just been created in the bin folder, but the connections didn't have the password tag, so after the user id I added ;password=xxxx to every connection and it worked fine. But every time I want to execute it, I have to type the password again. I also tried opening the package file as text and putting the password there, but when I open it again the password is gone. Finally, I tried to create a job that executes this file, and I had to type the password just as when executing directly from the file, but the job wouldn't run; it fails with an authentication problem. If I open the step configuration again, the password is not there - the same thing that happens every time, except now I can't run the package at all.
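This looks like the package ProtectionLevel at work: the default, EncryptSensitiveWithUserKey, silently strips passwords whenever the package is opened or run by a different user, which matches every symptom above. One common arrangement is to save the package with ProtectionLevel = EncryptSensitiveWithPassword and hand the package password to dtexec once, in the job's command line (path and password invented):

dtexec /F "C:\packages\MyPackage.dtsx" /De "packagePassword"

A package configuration (or a config file referenced in the Agent job step) is the other usual way to supply connection passwords at run time.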
Hi - I am attempting to run an SSIS package through a SQL Agent job step. The package is fairly basic - I run some Execute SQL tasks, build an ADO recordset, enter into a For-Each loop, and perform some table updates based on the ADO recordset. So far so good.

Approximately 1/3 of the records in my For-Each loop are processed, then I get a "Failed to acquire connection" error. All I am doing for each iteration of the For-Each loop is updating one record in a table based upon the current value that is being processed from the ADO recordset. I am running the job on a 64-bit box, version 9.00.3050.00. I have also run this on two other 64-bit boxes (versions 9.00.1399.06 and 9.00.1406.00) and had the same issue. The owner of the Agent job is a sysadmin on the box.

I don't understand why some items would be processed, then the connection drops mid-way through processing. What's even odder is the connection is dropped at the same point every time I run. The connection that I have defined in my package (I only have one) is provider type "Native OLE DB\SQL Native Client", and I am using Windows authentication. The version of Visual Studio that I have developed this package in is 8.0.50727.42. Please help.
I have written a little bit of VB.NET code that basically takes three strings, transforms them, and returns a single string to be stored in my table.
I am running into a strange problem, however... for some reason, a number of my processed rows are missing a pair of double quotes (").
The vast majority of the records are formatted properly, and have the double quotes in the expected locations.
The most frustrating thing about it is that I have included the offending input strings in my Test.sql test script, and when I step through the entire routine, the return value is perfect...
I apologize for being the worst ever at posting questions here; please let me know if I can add anything.
I am facing a weird problem while sync'ing. When I do a synchronize from a mobile device to the server DB, one of my table columns fails to update while all the other columns get updated.
The table uses row level tracking (tried making it column level too) and the field which fails to update is a date field (Nullable).
This happens when syncing after updating more than one record on the device; syncing after updating just one record, the date field gets updated as expected.
Note: I have filters set for this table. The filter downloads only records where this date field = null.
I need to download all records meeting this filter condition, but at the same time the filter should not be applied while uploading changes, as I think this is where the problem is.
If anyone has faced this and has a solution, please let me know.
We have several servers running SQL Server 2005, each upgraded from SQL Server 2000. Each server has about 50 databases on it. Each database has a table with the same name, containing login information for the database that the table resides in.
We have a VB 6.0 application that allows our Help Desk to do a SELECT on the table that contains login information. All it does is a SELECT; no UPDATEs, DELETEs, INSERTs, or DDL operations are performed by this application. The application logs into the databases using a Security Group login (let's call it MyGroup). Each member of the Help Desk team is a member of MyGroup.
About twice a month, the MyGroup security group loses SELECT permissions on all of the databases on a server. The server affected usually is different each time, but sometimes the same server will be affected two times in a row.
So far, we just run a script to update the SELECT permissions on the table in each of the databases, and this takes care of the problem (for now). But the problem seems to be recurring regularly.
What would make the server lose all permissions for a particular security group? The other logins continue to work fine. Only this one security group seems to be affected.
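For what it's worth, the re-grant script can be run across all databases on a server in one go with the undocumented sp_MSforeachdb procedure. A sketch, with the table and group names invented:

EXEC sp_MSforeachdb 'USE [?];
IF OBJECT_ID(''dbo.LoginInfo'') IS NOT NULL
    EXEC(''GRANT SELECT ON dbo.LoginInfo TO MyGroup'');';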
Hi, I'm working on a project that connects to a database on every view of a site. It works fine for a few hours, but eventually it just stops connecting to the database. There's nothing in the application logs to suggest where to look; it seems like the connections are just dying. I am creating the connection in a pretty normal manner:

SqlCommand myAwesomeCommand = new SqlCommand();
myAwesomeCommand.Connection = new SqlConnection(myAwesomeConnectionString);
myAwesomeCommand.CommandText = myAwesomeSQL;
myAwesomeCommand.CommandType = CommandType.Text; // Otherwise known as awesomeText
myAwesomeCommand.ExecuteScalar();

I also use a SqlDataAdapter and fill it for grabbing more complex data, but it's all happy System.Data.SqlClient stuff. But yeah, the thing seems to just give up after a few hours. Everything I have read about connection pooling in ADO.NET says that the connections will be reused. And yes, I believe I am closing all of my connections after I use them. The site is getting, oh, 7500 hits an hour maybe? Somewhere around that, maybe more, maybe a little less. Anyone have any ideas?
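Worth noting: as pasted, the snippet never opens or disposes its SqlConnection. If the real code ever skips a Close on an error path, the pool (capped at 100 connections by default) drains over a few hours and new requests start failing. A sketch of a pattern that returns the connection to the pool even on exceptions, reusing the poster's placeholder names:

using (SqlConnection conn = new SqlConnection(myAwesomeConnectionString))
using (SqlCommand cmd = new SqlCommand(myAwesomeSQL, conn))
{
    cmd.CommandType = CommandType.Text;
    conn.Open();                          // open as late as possible
    object result = cmd.ExecuteScalar();  // use the result as needed
}                                         // Dispose closes the connection and returns it to the pool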
When I restore a set of databases from backup, all login information is lost. The front end of the application provides a way to create new users, but it is a very long process. I am looking for a script or stored procedure that will help attach the users to the database.
I backed up and restored a database successfully except for the users. The only user that shows is dbo, but I can't add a user with the proper user name from my old server; I get the error that the user or role already exists in the current database.
When I go to Roles and look at the role named public, the user I need is listed there. I can't remove it as it's a table owner, but I need to add it to the list of users, so I need to remove it and then re-add it.
I tried using EXEC sp_changeobjectowner to no avail.
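These are classically orphaned users: database users whose SIDs no longer match any server login after a restore onto a different server. In SQL Server 2000/2005 the usual remedy is sp_change_users_login (user name invented):

-- report users in the current database that are not linked to a login
EXEC sp_change_users_login 'Report';
-- link a database user back to the login of the same name
EXEC sp_change_users_login 'Auto_Fix', 'MyUser';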
I have 2 machines, each running the same version of SQL Server. I'm trying to use DTS to export my data from one to the other, and it works fine, except for the fact that I lose my unique index on a field. Is there a way I can keep this from happening, or am I going to have to recreate it each time? I wouldn't think I would have to; seems like bad design to me.
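Re-creating it by hand is at least a one-liner (index, table, and column names invented):

CREATE UNIQUE INDEX IX_MyTable_MyField ON dbo.MyTable (MyField);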
I'm testing the use of application roles for security. The customer I work for still has a lot of ASP intranet applications running. We're migrating the databases to a SQL Server 2005 server.
I've changed the connection string to a user with no permissions other than to log on. After that I use an application role for permission to select from different tables and to execute stored procedures.
The first queries do execute but after that I get "Permission denied", like I haven't got the application role anymore.
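For reference, an application role is activated per connection with sp_setapprole, and before SQL Server 2005's sp_unsetapprole it could not be deactivated at all; so if ADO connection pooling hands a page a recycled connection that never ran the activation (or resets one that did), the permissions appear to vanish mid-session. A sketch of the activation call (role name and password invented):

EXEC sp_setapprole @rolename = 'MyAppRole', @password = 'MyPassword';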
On my SSIS package I have 2 connection managers, both connecting to an Oracle DB. I enter the ID and password, click 'Test Connection', and I'm able to connect to the database fine. I then go to a data flow in one of my control flow tasks, open up my OLE DB Source, and click Preview. The SQL query executes with no issues; I can see the data from my connection manager source.
I then go and run the package and I get this error message:
[OLE DB Source [1]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "OracleConnection" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
I've been racking my brain on this for a few days and still no luck on getting this to work.
We've got a server running SQL Server 2000 SP4 with a database that's being replicated. There are 17 subscribers running MSDE SP4 using merge replication. Replication is started manually.
Initially we tested this with two subscriptions and everything went well, but now, for the last 3 months, we have been facing a weird problem while sync'ing. We have massive data loss on records that were inserted at the subscribers. Records seem to disappear, but only records that have a foreign key constraint. What I mean is that, for example, a record is inserted in the table that holds our client records with primary key 'ClientID', and then a record is inserted in a table with actions for that client, with a foreign key 'ClientID' referring to the client table. After sync'ing, the client record is inserted correctly in the database on the publisher, but the records in the table with actions are gone.
As far as I know the tables are correctly formed, with identity set to 'not for replication' and so on. In short, I can't find any problem, especially since it doesn't happen every time.
If anyone has faced this and has a solution, please let me know. Thanks.
1. Create Temp Table (Execute SQL)
2. Load Temp Table (Execute SQL)
3. Data Flow Task using the temp table
4. Data Flow Task using the temp table
5. Drop Temp Table (Execute SQL)

Tasks 3 & 4 are currently set up to run in parallel - meaning they both branch directly off of task #2, and both lead to task #5. Also, my connection manager is set to retain its connection.

Here's the problem. When executing this flow, task #3 will sometimes fail and other times succeed - without me changing anything between runs (I simply end debugging and run it again). When it does fail, it is always because it cannot find the temp table. One thing to note is that task #4 (running at the same time as #3 and using the same temp table) always finishes first because it does quite a bit less work. Last thing to note: if I set up task #4 to run after #3, it works every time, but of course it takes much longer for the total package to execute.
How is it that the task can find the temp table on some runs, but not on others? If this is just quirky behavior that I am going to have to live with for the time being, is there a way to force a task to re-run X number of times before giving up and moving on?
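A hedged thought: a #temp table is scoped to the session that created it, so whether tasks #3 and #4 can see it depends on whether they really acquire the same retained connection - two parallel tasks may not. A global temp table is visible from any connection for as long as at least one connection references it, which is a workaround people often report for exactly this pattern (column list invented):

CREATE TABLE ##Staging
(
    ID     int   NOT NULL,
    Amount money NULL
);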
I've experienced a problem (bug?) when deploying reports to the report server. My users kept complaining about losing authentication to reports whenever I deploy a new report. I found the problem, but I don't understand why it is happening. My colleague developed the earliest reports and logs in with a different user on Remote Desktop to develop them. When a report is deployed, the data sources are deployed also.
Now when I log in on Remote Desktop with a different user than my colleague, open the same report project, and check the data sources, the credentials are gone...?! So when I deploy a report, the data source with the empty credentials is deployed too, which causes all the authentication problems.
My question: how come I can't see the user credentials my colleague filled in? When I open the report project the credentials are gone. I log out, my colleague logs in, and the credentials are back.
I suppose this has something to do with Remote Desktop and the users connecting to it?
I have written a simple SQL Server 2005 package to pull some data from Oracle (using ODBC) and pumping it into SQL Server. When I run it from the server in debug mode in VS it works fine. When I schedule the job it errors out with "ora-01005: null password given; logon denied." The password is there. Has anyone experienced this? Is there a security setting somewhere preventing me from saving passwords? Is there a work around? Thanks.
Hi, our report data sources keep losing their credentials. We use custom data sources and store credentials on the server, yet for some reason the credentials get removed and have to be re-entered.
I have a VB6 application that has MSDE as its current backend. In testing a move to SQL Server Express I've noticed, since this application is run on laptops, that if I disconnect the network cable while I'm in the VB6 app, I lose the "connection" to the database. If I end task on the app and restart it, it works fine. If I reconnect the network cable while the app is running and then disconnect it, the app runs fine. This also happens when I'm connected to a wireless network and then disconnect and disable that connection.
In my template.ini file I have DISABLENETWORKPROTOCOLS=0. Is there a setting I'm missing? I can't seem to find anything else to set in the ini file.
I'm adding data to a text column, and whenever I have a backslash at the end of a line it disappears. Here's an example: Code:
INSERT INTO MyTable (TextCol) VALUES ('some text \
some more text \
yet more text')
The \ on the first line is fine - the \ on the 2nd line just disappears. If I add a 2nd backslash at the end of the line, one is inserted. If I add a space to the end of the line, everything works as normal. I can fix this client-side, but before I do I'd really like to know what's going on.
This is an infrequent problem but very annoying when it happens. Every so often, I go into Enterprise Manager only to find that all of my SQL Server registrations have disappeared from the group and I have to add them all again. As it happens so infrequently, I don't know if there's something I'm doing on my machine to make them all disappear.
I'm fairly new to MS SQL & I have a problem copying tables from one d/b to another. When I do an import or export and select 'Copy tables', the tables & the data get transferred, but all the Identities are set to NO. My d/bs have 188 tables & it's a pain to reset all of them manually.
Is there a setting I don't know about that will transfer the tables & maintain the Identity? (I'm using SQL Server Management Studio 9.0)
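If the wizard insists on creating the destination tables without the IDENTITY property, one workaround is to script the schema from the source database first (so the property is in the CREATE TABLE), then have the wizard copy data only, or move the rows yourself with identity insert enabled per table. A sketch (all names invented):

SET IDENTITY_INSERT dbo.MyTable ON;
INSERT INTO dbo.MyTable (ID, Name)
SELECT ID, Name FROM SourceDB.dbo.MyTable;
SET IDENTITY_INSERT dbo.MyTable OFF;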
We have a clustered SQL Server 2000 installation on Windows 2000. We recently lost the 'select into/bulkcopy' option from a database. The most likely candidate appears to be a failover just before - after this happened, the setting disappeared. Any ideas? Thanks, Tom
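For the record, turning the option back on in SQL Server 2000 is a one-liner (database name invented):

EXEC sp_dboption 'MyDatabase', 'select into/bulkcopy', 'true';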
I have a staging table that has a float [DT_R8] column that shows 4 decimal places. When I use an OLE DB Source referencing that table to go to an OLE DB Destination referencing a table with an identical column the data gets rounded to a single decimal place. Needless to say this is really messing with the values.
I can see the Scale property in the Advanced Editor for OLE DB Destination but I cannot change it. Same for the OLE DB Source.
Oh, and if I do an insert using SQL in Management Studio I have no problem getting the 4 decimal places in. For example:
INSERT INTO table2 (Col1, Col2) SELECT Col1, Col2 FROM table1
Moves all the data and keeps the 4 decimal places.
I'm having some trouble converting values represented as strings to the decimal data type. I have a flat file source from which I read some currency rates represented without decimals. Before sending those values to my SQL Server destination, I want to convert them so they represent the correct values.
An example to clarify:
If my source contains a column named "curr_rate" with the value 000092500 I want to send it to my destination as 9,2500.
So I set up a Derived Column component, converting my value like so:
((DT_NUMERIC,9,4)curr_rate)/10000
My problem is that the precision is lost, and all that's sent to my destination table is 9,0000.
How should I go about to convert my strings without losing precision in the process?
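One hedged guess about the expression above: the literal 10000 is an integer, so the division may be carried out before a 4-decimal scale is in play. Casting the divisor as well (or dividing by a numeric literal) is the adjustment usually suggested, in the same derived-column expression syntax:

(DT_NUMERIC,9,4)curr_rate / (DT_NUMERIC,9,4)10000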
We are using Reporting Services 2005 and have been for some time. Recently we have started using subscriptions as well. However, we have come across a problem with this.
Currently we are updating our report templates by removing them and then creating them again. This worked fine until we started using subscriptions. Now, when the reports are removed, the subscriptions that are associated with the reports are also removed.
Is there a way to update a report template, without having to remove the report first? And can this be done programmatically? The code is currently publishing the reports by calling ReportingService2005.CreateReport(name, "/" + reportFolder, false, template, null);
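Assuming that third argument is CreateReport's Overwrite flag, passing true instead of deleting and re-creating the report should replace the definition in place, and an overwritten report keeps its existing subscriptions (worth verifying on a test folder first):

ReportingService2005.CreateReport(name, "/" + reportFolder, true, template, null);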