I want to load this data into SQL Server tables so that the data is imported without the trailing spaces (blanks) after each data element.
Thus I would like to trim the extra spaces (blanks) from the data before import.
With DTS, the default behavior is to import blanks ("") in a data file as nulls in the table.
However, in SSIS, the default behavior is to import blanks as blanks.
I need to ensure that the packages I'm migrating import the data in exactly the same way. What is the easiest way to convert blanks to nulls?
Right now I'm using a script to convert the blanks to nulls before they are written to the table, but I'm wondering if there's an easier way.
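For comparison, the same conversion can be done in T-SQL after the load; a minimal sketch, where the table and column names (dbo.ImportTarget, SomeCol) are hypothetical placeholders:

-- trims trailing blanks and turns all-blank values into NULL in one pass
UPDATE dbo.ImportTarget
SET SomeCol = NULLIF(RTRIM(SomeCol), '');

NULLIF returns NULL when the trimmed value equals the empty string, which reproduces the old DTS behaviour.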
A view named "Viw_Labour_Cost_By_Service_Order_No" has been created and can be run successfully on the server. I want to import the data drawn from the view into a table using the SQL Server Import and Export Wizard. However, when I run the wizard on the server, it gives me the following error message and stops at the step "Setting Source Connection":
Operation stopped...
- Initializing Data Flow Task (Success)
- Initializing Connections (Success)
- Setting SQL Command (Success)
- Setting Source Connection (Error)
Messages
Error 0xc020801c: Source - Viw_Labour_Cost_By_Service_Order_No [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0014019. There may be error messages posted before this with more information on why the AcquireConnection method call failed. (SQL Server Import and Export Wizard)
Exception from HRESULT: 0xC020801C (Microsoft.SqlServer.DTSPipelineWrap)
- Setting Destination Connection (Stopped)
- Validating (Stopped)
- Prepare for Execute (Stopped)
- Pre-execute (Stopped)
- Executing (Stopped)
- Copying to [NAV_CSG].[dbo].[Report_Labour_Cost_By_Service_Order_No] (Stopped)
- Post-execute (Stopped)
Has anyone encountered this problem before, and do you know what is happening?
I am attempting to import data from Microsoft Access databases to SQL Server 2000 using the DTS Import/Export Wizard. I have a few errors.
Error at Destination for Row number 1. Errors encountered so far in this task: 1.
Insert error column 152 ('ViewMentalTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 150 ('VRptTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 147 ('ViewAppTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 144 ('VPreTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 15 ('Time', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Invalid character value for cast specification. Invalid character value for cast specification. Invalid character value for cast specification. Invalid character value for cast specification. Invalid character value for cast specification.
Could you please look into this and guide me? Thanks in advance. Venkatesh, imtesh@gmail.com
I have a table that I need to delete some data from and put the deleted data into a different table. How do I script the following: if Field1 in Table1 is null, remove that row from Table1 and put it in a new table called Table2?
Regards, Ciarn
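On SQL Server 2005 or later the move can be done in one statement with the OUTPUT clause; a sketch, assuming Table2 already exists with the same columns as Table1:

-- moves the rows and captures them in Table2 atomically
DELETE FROM Table1
OUTPUT DELETED.* INTO Table2
WHERE Field1 IS NULL;

On older versions, an INSERT followed by a DELETE inside one transaction achieves the same thing:

BEGIN TRAN
INSERT INTO Table2 SELECT * FROM Table1 WHERE Field1 IS NULL
DELETE FROM Table1 WHERE Field1 IS NULL
COMMIT TRAN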
I am not sure how to implement the following, but I believe it entails using DTS, and hopefully it is fine that I post it here because ultimately I will need this backend data for my frontend .aspx pages:
On a weekly basis, I need to IMPORT some data located on a remote Oracle DB into SQL Server 2k. Since there is so much data to transfer, I would like to transfer only the data that is new to the table since the last IMPORT, i.e. a week ago, and leave behind the OLD data.
Is DTS the correct way to go, or do I have more control combining DTS with STORED PROCEDURES? Does anyone have any good references for me?
On a similar note, once this Oracle data is IMPORTED into a certain table, I would like to EXPORT some of these NEWLY acquired rows matching certain criteria into another table for auditing purposes. For this scenario, should I implement an UPDATE TRIGGER on the first table?
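For the import side, a common pattern is to keep the timestamp of the last successful run in a small control table and only pull rows newer than it. A sketch with entirely hypothetical names (dbo.LastImport, dbo.StagingTable, dbo.TargetTable, created_date), assuming the Oracle rows carry a usable date column:

DECLARE @last datetime
SELECT @last = last_import_date FROM dbo.LastImport

INSERT INTO dbo.TargetTable (id, payload, created_date)
SELECT id, payload, created_date
FROM dbo.StagingTable            -- rows staged from Oracle by the DTS step
WHERE created_date > @last

UPDATE dbo.LastImport
SET last_import_date = ISNULL((SELECT MAX(created_date) FROM dbo.StagingTable), last_import_date)

For the audit question, an AFTER INSERT trigger on the first table is one workable option; running the same filtered INSERT as a second step of the weekly job would avoid trigger overhead.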
I am running a query on multiple tables, and the data I get back consists of several repeated rows, but with one column different. I want to take out those repeated rows and, for the column that is different, join that data and separate it by a comma. Can this be done?
Ex.
Cindy Lair 111 Drury Circle Harrisburg Pennsylvania 717
Cindy Lair 111 Drury Circle Harrisburg Pennsylvania 610
Cindy Lair 111 Drury Circle Harrisburg Pennsylvania 310
So I would like this data to come up as: Cindy Lair 111 Drury Circle Harrisburg Pennsylvania 717,610,310
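On SQL Server 2005 or later, one common way to build the comma-separated list is FOR XML PATH. A sketch, with a hypothetical table dbo.Contacts holding the name/address columns and a varchar area_code column standing in for the differing value:

SELECT c.first_name, c.last_name, c.street, c.city, c.state,
       STUFF((SELECT ',' + c2.area_code
              FROM dbo.Contacts c2
              WHERE c2.first_name = c.first_name
                AND c2.last_name  = c.last_name
                AND c2.street     = c.street
              FOR XML PATH('')), 1, 1, '') AS area_codes
FROM dbo.Contacts c
GROUP BY c.first_name, c.last_name, c.street, c.city, c.state

STUFF simply strips the leading comma; on SQL Server 2017+ STRING_AGG(area_code, ',') does the same in one call.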
Seems obvious, but I can't see how: how would I remove columns from a data flow so that columns which were used earlier but are not needed for the insert/update are taken out of the flow?
I'm asking because the data ends up in an update statement, and the flow has grown so big it is unreadable.
I was wondering if someone could tell me how to use a SQL function to remove everything from the mess of characters below except 'Test.'. Really, any message could be substituted, so it's not simply a matter of stripping out everything except the literal string 'Test.'.
{\rtf1\ansi\ansicpg1252\deff0\deflang1033{\fonttbl{\f0\fnil\fcharset0 MS Shell Dlg 2;}{\f1\fnil MS Shell Dlg 2;}} {\colortbl ;\red0\green0\blue0;} {\*\generator Msftedit 5.41.15.1507;}\viewkind4\uc1\pard\tx720\cf1\f0\fs20 Test.\f1\par }
It's RichText.
The fs20 could be fs24, fs25, etc. (any number). The end of the number would be the point to trim the beginning off.
The '\f1' right after 'Test.' seems like it would be the point to start trimming the end off.
I've been experimenting with SUBSTRING together with CHARINDEX, but am just not getting close to a solution.
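A sketch of one way to do it with PATINDEX/CHARINDEX/SUBSTRING, using a shortened stand-in for the real RTF string: find the \fs marker, skip to the space that ends its number, then cut at the next backslash. This assumes the message itself contains no backslash.

DECLARE @rtf varchar(8000), @p1 int, @start int, @end int
SET @rtf = '{\rtf1\ansi ... \fs20 Test.\f1\par }'   -- shortened sample

SET @p1 = PATINDEX('%\fs[0-9]%', @rtf)         -- locate \fs followed by a digit
SET @start = CHARINDEX(' ', @rtf, @p1) + 1     -- the text begins after the space that ends the number
SET @end = CHARINDEX('\', @rtf, @start)        -- and ends at the next RTF control word
SELECT SUBSTRING(@rtf, @start, @end - @start)  -- returns 'Test.'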
I have a table with several columns of information that I wish to set up some form of schedule to go through this data and remove any special characters that may interfere with other code processes.
Mainly the commas and the apostrophes. They really mess with my ASP pages and scripts when retrieving this information and trying to do other things with it, so I need to figure out how to remove them from the tables so they do not cause these issues.
Knowing this, I cannot figure out how to keep the data in the row/column and just strip the special characters out of that data. The other problem is that everything I try requires me to insert either a comma or an apostrophe as part of the code string, which is precisely my issue.
How can I parse through my data, leave the data as-is, but just get rid of the commas, apostrophes, and double quotes?
Does anyone have a basic example that I can use to expand on?
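A sketch using CHAR() codes, so that neither a comma nor an apostrophe has to appear literally in the script (table and column names are placeholders):

-- CHAR(44) = comma, CHAR(39) = apostrophe, CHAR(34) = double quote
UPDATE dbo.MyTable
SET MyCol = REPLACE(REPLACE(REPLACE(MyCol, CHAR(44), ''), CHAR(39), ''), CHAR(34), '')

Run the same expression once per affected column; it can be wrapped in a stored procedure and scheduled as a job.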
I have a large (420GB) database that has never had data archived off before. I have taken a backup to a test server and run a script supplied by the product vendor, which has removed a large amount of old data that is no longer required.
I have checked within Enterprise Manager that this data has now gone; however, the actual file itself has not shrunk in size. Is there a further step I need to take to get the space back?
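Deleting data frees space inside the file but does not resize the file itself; shrinking is the extra step. A sketch, where the database and logical file names are placeholders you would read from sp_helpfile:

USE MyDatabase
EXEC sp_helpfile                          -- lists the logical file names and current sizes
DBCC SHRINKFILE (MyDatabase_Data, 10000)  -- shrink the data file to a 10,000 MB target

Bear in mind that shrinking heavily fragments indexes, so rebuilding them afterwards is usually advisable.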
About a month ago I restored a database that had been originally created in MSDE into a full version of SQL Server 2000. I thought that the size restriction would automatically be removed by putting it into the full version of SQL. I was wrong, and about a week or so later the database reached the 2GB size limit and needed attention. I received some assistance from Microsoft, and they walked my client through the process of removing the size restriction. Since then everything has been OK.
Now I am doing something similar. My client has a database that was originally created in MSDE. We upgraded them to SQL Express and the database has now grown to 4GB, which is the max that SQL Express allows.
We will be installing SQL for Workgroups this week.
My question is this: is there a setting I must change inside the database or in SQL for Workgroups that will allow the database to grow beyond 4GB? We need to let it expand to whatever they need, and I can't seem to find any documentation on whether or not I have to change a setting for this database.
I have found a bunch of duplicate records in our housing database that ideally I need to delete. There are two tables that I need to remove data from: ih_cml_log_entry and ih_cml_log_notes. There is no unique identifier between the tables for a log entry, so I have had to join on the person_ref, log_seq and the date/time of entry. How do I go about deleting the data? I've used the script below to identify what I need to delete:
SELECT *
FROM (
    select cml.person_ref,
           cml.open_date + open_time as 'datetime',
           cml.open_user,
           cml.log_type,
           ROW_NUMBER() OVER (PARTITION BY cml.person_ref, cml.open_date + cml.open_time,
                                           cml.open_user, cml.log_type
                              ORDER BY (SELECT 0)) AS RowNo,
           n.note
    FROM ih_cml_log_entry cml
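Since the fragment above was cut off, here is a hedged sketch of the delete itself on SQL Server 2005+: wrap the ROW_NUMBER() in a CTE and delete everything with RowNo > 1. The notes table is not touched here; its matching rows would need their own delete (joined on person_ref, log_seq and the date/time, as described above), ideally in the same transaction.

;WITH dupes AS (
    SELECT cml.person_ref, cml.log_seq,
           ROW_NUMBER() OVER (PARTITION BY cml.person_ref,
                                           cml.open_date + cml.open_time,
                                           cml.open_user, cml.log_type
                              ORDER BY (SELECT 0)) AS RowNo
    FROM ih_cml_log_entry cml
)
DELETE FROM dupes
WHERE RowNo > 1   -- keeps one row per duplicate group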
I wrote a simple data flow with an OLE DB source and destination. I do a direct mapping of the columns (colA -> colA) with no transformations needed. I found that colA on the destination does not allow NULLs (required by the program that accesses that database), while colA on the source supports and has NULLs. Is there any accommodation for handling NULLs (like mapping them to blanks) in the direct copy approach, or do I need to read each row and test whether colA is NULL?
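If rewriting the source query is an option, the mapping can be done there; a minimal sketch with a placeholder table name:

-- blank out NULLs at the source so the NOT NULL destination column is satisfied
SELECT ISNULL(colA, '') AS colA
FROM dbo.SourceTable

A Derived Column transformation with an equivalent expression would do the same inside the data flow without touching the query.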
Does having a lot of blanks in a column cause errors?
In 2 or 3 packages where I get 95% of the data in the Error Column, I can find nothing wrong with the data at all: it all either seems perfect or there is no data in the column in question. Usually I would have maybe 800 rows where data would have been inserted, but in the other 40,000 rows the column is blank.
Using the "Retain null values from the source as null values in the destination" option doesn't seem to make a difference.
This is the error description
The data value cannot be converted for reasons other than sign mismatch or data overflow.
Does anyone know of a solution, or the reason why this keeps happening?
We have deleted 120GB of data but the space was not released even after 2 days. Is there a reason for this? How exactly does SQL Server release the space after truncating a 120GB table?
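Truncating or deleting returns the space to the database's internal free space, not to the operating system; the files keep their size until they are shrunk. A quick way to see where the 120GB went (the table name is a placeholder):

EXEC sp_spaceused                    -- database totals: size, unallocated space
EXEC sp_spaceused 'dbo.MyBigTable'   -- reserved/used space for one table

If the goal is to hand the space back to the OS, DBCC SHRINKFILE on the data file is the step that actually resizes it.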
I have two tables: one lists changes of hospital ward and one lists changes of consultant doctor. These can change independently, i.e. a ward change can occur without a consultant change and vice versa. I want to summarise these changes to give the status at each date_serial value.
select date_serial, consultant_id, null as ward_id from #temp_consultant_episode
union
select date_serial, null as consultant_id, ward_id from #temp_ward_stay
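One hedged way to turn that union into a status row per date_serial is to look back, for each date, at the most recent value from each table (this assumes date_serial ordering reflects the order the changes happened):

SELECT d.date_serial,
       (SELECT TOP 1 c.consultant_id
        FROM #temp_consultant_episode c
        WHERE c.date_serial <= d.date_serial
        ORDER BY c.date_serial DESC) AS consultant_id,   -- consultant in effect at this date
       (SELECT TOP 1 w.ward_id
        FROM #temp_ward_stay w
        WHERE w.date_serial <= d.date_serial
        ORDER BY w.date_serial DESC) AS ward_id          -- ward in effect at this date
FROM (SELECT date_serial FROM #temp_consultant_episode
      UNION
      SELECT date_serial FROM #temp_ward_stay) AS d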
declare @error int, @rowcount int
select @rowcount = COUNT(1) FROM STG_BCDR;
while @rowcount > 0
begin
    BEGIN TRAN Deletion
[code]....
In the code above I try to delete records batch by batch to avoid locking the BCDR table. The total number of records in this BCDR table is 40,000. However, when I check the execution plan, the BCDR table still shows a clustered index scan, which means the locking still happens.
If I change the DELETE TOP (5000)... to DELETE TOP (5)..., then there is a clustered index seek, which is good. The problem is that deleting only the top 5 records each time means it will take a very long time to remove all that data.
How can I handle this situation so that I can delete this huge amount of data without table locking? If table locking happens, other users will not be able to access the table at the same time.
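Two things usually help here, sketched below with a hypothetical created_date filter column standing in for the real criteria. First, the scan-versus-seek behaviour is driven by whether an index supports the WHERE clause, so an index on the filter column matters more than the TOP size. Second, lock escalation to a table lock kicks in at roughly 5,000 locks, so batches kept below that threshold tend to stay at row/page level:

DECLARE @rows int, @cutoff datetime
SET @cutoff = '2008-01-01'               -- hypothetical; substitute the real criteria
SET @rows = 1
WHILE @rows > 0
BEGIN
    DELETE TOP (1000) FROM STG_BCDR
    WHERE created_date < @cutoff         -- an index on created_date lets this seek
    SET @rows = @@ROWCOUNT
END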
It seems that there should be a solution for my situation, but for the life of me I can't seem to figure it out.
I need to compare two "like" tables containing similar data. Tbl 1 is "BOOKED" (a snapshot of inventory) and tbl 2 is "CURRENT" (the live, working inventory table). If I write my query as follows, the subsequent result is "duplicate" data.
Code Block
SELECT booked.item, booked.bin, booked.quantity, current.bin, current.quantity
FROM BOOKED
LEFT JOIN CURRENT ON booked.item = current.item
No matter what type of join I use, there is duplicate data displayed for each table. For example, if there are more bins in the BOOKED table that contain a certain product, then the CURRENT table will repeat data, and vice versa.
As follows:
Item Bin Quantity Bin Quantity
12345 A01 500 A01 7680
12345 B01 6 A01 7680
12345 C01 20 A01 7680
54321 G10 1032 E15 1163
54321 G10 1032 F20 523
54321 G10 1032 H30 750
98765 Z20 7000 Z20 8500
98765 Y15 2500 Y15 3000
98765 X10 1200 Y15 3000
What I would like to do is display Bin and Quantity only once, and show the repeating values as NULL or [BLANK]. Or, display all of the bins from both tables and only the quantities from each table in relation to the bin found in that table, returning a "0" if no quantity exists.
This is what I'm after:
Item Bin Quantity Bin Quantity
12345 A01 500 A01 7680
12345 B01 6 B01 0
12345 C01 20 C01 0
54321 G10 1032 E15 1163
54321 F20 0 F20 523
54321 H30 0 H30 750
98765 Z20 7000 Z20 8500
98765 Y15 2500 Y15 3000
98765 X10 1200 X10 0
Is this possible? If so, how?
I might also add that it is OK for each table to contain multiple entries for any given item. This is basically being requested as an inventory variance report (inventory before physical count and immediately after physical count) and will only be run once a year.
----------------------------------------------- Just thinking out loud here: What if I created three subqueries, the first containing only BOOKED information, the second containing only CURRENT information and the third being a UNION of both tables? Something like this:
Code Block
SELECT q3.bin, q1.item, ISNULL(q1.quantity, 0) as QTY_BEFORE, ISNULL(q2.quantity, 0) as QTY_AFTER
FROM
(select item, bin, quantity from BOOKED)q1 Left Join
(select item, bin, quantity from CURRENT)q2 on q1.item = q2.item Left Join
(select bin, item from BOOKED UNION CURRENT)q3 on q1.item = q3.item
Order By q1.item
I don't know if I wrote the UNION statement correctly, but I will have to try this when I get back to work...
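For what it's worth, a FULL OUTER JOIN on both item and bin may get there without the three subqueries; a hedged sketch (CURRENT is bracketed because it is a reserved word in T-SQL):

SELECT COALESCE(b.item, c.item) AS item,
       COALESCE(b.bin,  c.bin)  AS bin,
       ISNULL(b.quantity, 0)    AS qty_before,   -- BOOKED side, 0 when the bin is missing
       ISNULL(c.quantity, 0)    AS qty_after     -- CURRENT side, 0 when the bin is missing
FROM BOOKED b
FULL OUTER JOIN [CURRENT] c
       ON b.item = c.item
      AND b.bin  = c.bin
ORDER BY item, bin

This matches the "all bins, zero where missing" variant described above, one bin per row rather than paired side by side.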
I'm trying to integrate SQLEXPRESS in a custom application ("the app"). On installation, a new instance ("myNewInstance") is created, and a new DB ("myDB") is created on that instance.
Upon de-installation of "the app", I remove the new instance. To that end, I'm calling the SQLEXPRESS setup like this:
Code Snippet
setup.exe /qb REMOVE=SQL_Engine INSTANCENAME=myNewInstance

The problem: while the service and the instance are removed, the instance's data directory is not, and the DB data files are still there. Now if I reinstall the app (re-creating myNewInstance), it uses the same directory structure, and when I try to re-create the DB, I get the following error message:
Code Snippet
Msg 5170, Level 16, State 1, Line 1
Cannot create file '[...]MSSQL.2\MSSQL\DATA\myDB.mdf' because it already exists. Change the file path or the file name, and retry the operation.

So here's the question: is there a) any way to tell setup.exe to completely remove the data files when uninstalling the instance, or b) any way to tell T-SQL to overwrite the old data files if they exist?
I posted this scenario earlier and ndinakar said my code worked on (I guess) his own machine, but it has not worked on mine! Doesn't that sound funny? I need to get through with this simple but frustrating stuff in time. I've implemented more complex snippets before this, but I just can't figure the problem out.
Can anyone out there tell me what I could be doing wrong? I wrote a simple stored procedure called 'proc_insert_webuser' in a database that has been moved to a remote server called LAGOS-NTS3. I executed it from SQL Query Analyzer and it worked. When, however, I executed the same stored procedure from ASP.NET, it gives the impression that a new record has been inserted and even displays the newly generated ID column to me. On checking the base table I found only blank columns, with only the ID column having the generated IDs!
I'm perplexed. I've checked the code and can't spot the error. Can anyone help? Below are the snippets used.
CREATE PROCEDURE proc_insert_webuser (
    @applicantid numeric(18) output,
    @firstname varchar(18),
    @midname varchar(18),
    @lastname varchar(50),
    @secretcode varchar(18),
    @my_err numeric(18) output
)
AS
IF (SELECT COUNT(ApplicantID) FROM tbl_webuser
    WHERE Firstname = @firstname AND Midname = @midname AND Lastname = @lastname) = 0
BEGIN
    begin transaction
    INSERT INTO tbl_webuser (Firstname, Midname, Lastname, SecretCode)
    VALUES (@Firstname, @Midname, @Lastname, @SecretCode)
    commit transaction
    SET @applicantid = @@IDENTITY
END
ELSE
BEGIN
    RAISERROR('Error! Execution aborted. A matching user profile exist.', 16, 1)
    SELECT @my_err = @@ERROR
END
RETURN
Then the ASP.Net stuff:
Sub PostData()
'connection string production
Dim cstring As String = "User ID = sa; Password = ; database = E-Recruitment; Server = LAGOS-NTS3; Connect Timeout = 60"
'connection instantiation
Dim cnx As SqlConnection = New SqlConnection(cstring)
Try
'open connection
cnx.Open()
'instantiates and execute a command object
Dim cmd As SqlCommand = New SqlCommand
With cmd
.Connection = cnx
.CommandText = "proc_insert_webuser"
.CommandType = CommandType.StoredProcedure
'numeric(18) maps to Decimal; DbType.Single is a floating-point type and can silently lose precision on large IDs
Dim param1 As New SqlClient.SqlParameter("@applicantid", SqlDbType.Decimal)
param1.Direction = ParameterDirection.Output
.Parameters.Add(param1)
Dim param2 As New SqlClient.SqlParameter("@firstname", txtFirstNM.Text)
param2.DbType = DbType.String
param2.Direction = ParameterDirection.Input
.Parameters.Add(param2)
Dim param3 As New SqlClient.SqlParameter("@midname", txtMidNM.Text)
param3.DbType = DbType.String
param3.Direction = ParameterDirection.Input
.Parameters.Add(param3)
Dim param4 As New SqlClient.SqlParameter("@lastname", txtLastNM.Text)
param4.DbType = DbType.String
param4.Direction = ParameterDirection.Input
.Parameters.Add(param4)
Dim param5 As New SqlClient.SqlParameter("@secretcode", txtPWD.Text)
param5.DbType = DbType.String
param5.Direction = ParameterDirection.Input
.Parameters.Add(param5)
'@my_err is also numeric(18) in the procedure, so Decimal matches here as well
Dim param6 As New SqlClient.SqlParameter("@my_err", SqlDbType.Decimal)
param6.Direction = ParameterDirection.Output
.Parameters.Add(param6)
.ExecuteNonQuery()
'pick the auto-number
Dim strAppID = .Parameters("@applicantid").Value
'notifies the user of record success
lblMessage.Text = "Profile created. Your ApplicantID is : " & strAppID & " Kindly take note of your UserID. You'll need it for future logins."
End With
Catch ex As Exception
'notifies the user of any error(s)
lblMessage.Text = ex.Message & " originating from " & ex.Source
Finally
cnx.Close()
End Try
End Sub
I'm calling this procedure from the click event of a command button. What could be going wrong in the code? I have done similar things before and got positive results! Any hints from anyone out there?
A table was created in version 6.5 that has 2 blank spaces and a slash '/' in its name (i.e. Item Type w/Groups). This was done in error, and now the table cannot be dropped, or renamed and then dropped.
Does anyone know how I can delete this table? Please let me know.
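Quoted identifiers should let you name the table exactly, spaces and slash included; a sketch (adjust the spacing inside the quotes to match the real name, since the post mentions two blanks):

SET QUOTED_IDENTIFIER ON
GO
DROP TABLE "Item Type  w/Groups"

On SQL Server 7.0 and later, the bracketed form [Item Type  w/Groups] works as well.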
I'm trying to create a view from a table in which I want to concatenate two columns into a new column. The concatenation part works fine when the data "supports" calculations, i.e. no nulls. The data, however, is "infected" in two ways: I have NULL values, and I have blanks alongside some real information. I concatenate via a CASE statement, but I can't figure out how to take the three situations into account (nulls, blanks and data) and create one meaningful column in my view.
The result should be: column1 + "-" if column2 is null or blank; column1 + "-" + column2 if column2 has meaningful data.
Both the involved columns are text (varchar).
If anyone has been in this situation before, could you please provide some code that handles it?
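A sketch that covers all three cases in one expression (only the table name is a placeholder; column1/column2 are from the post): NULLIF collapses a blank-or-spaces value to NULL, and ISNULL then maps that NULL to an empty string.

SELECT column1 + '-' + ISNULL(NULLIF(LTRIM(RTRIM(column2)), ''), '') AS combined
FROM dbo.MyTable

When column2 is NULL or blank this yields column1 + '-'; otherwise it yields column1 + '-' + column2.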
The result could be a password like "0l$IE", and I then save this password to a SQL server. The problem, however, is that five blanks are added along with the generated password. Instead of "0l$IE" I get "0l$IE" plus five trailing blanks. I hope you understand what I mean. Is there an easy way to correct this?
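Five extra blanks on a five-character password is exactly what a fixed-width char(10) column would produce, since char pads every value to its declared length. One hedged fix, with placeholder names, is to trim the padding on the way out (or declare the column as varchar instead):

SELECT RTRIM(password_col) AS password
FROM dbo.Users   -- hypothetical table/column names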
I have an Excel spreadsheet with a field that may contain blanks (empty). When I filter on that field to show only the blanks, I get 4000 records returned. But if I import the spreadsheet into SQL Server, the test in there for blanks ('') returns nothing. However, if I test for Nulls in that field, I get 5000 records returned. I don't know which one (Excel or SQL Server) is giving me the right answer. Can anyone help me understand what's going on?
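One way to settle it is to count both conditions in the imported table (names are placeholders):

SELECT SUM(CASE WHEN MyField IS NULL THEN 1 ELSE 0 END) AS null_count,
       SUM(CASE WHEN MyField = ''   THEN 1 ELSE 0 END) AS blank_count,
       COUNT(*)                                        AS total_rows
FROM dbo.ImportedSheet

Empty Excel cells typically arrive in SQL Server as NULL rather than '', so both tools can be "right": Excel's filter counts visually blank cells, while SQL Server distinguishes NULL from the empty string.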
create table #datatable (
    id int,
    name varchar(100),
    email varchar(10),
    phone varchar(10),
    cellphone varchar(10),
    none varchar(10)
);
insert into #datatable exec ('select * from datatable where id = 1')
select * from #datatable
I still want to retrieve the data and present it in the software's presentation layer, but I also want to remove the temp table in order to avoid performance issues etc.
If I add DROP TABLE #datatable after the insert into #datatable exec ('select * from datatable where id = 1'), will it still work to present the data in the presentation layer after removing the temp table?
The goal is to retrieve the data from a stored procedure and present it in the software's presentation layer, and I still need to remove the temp table after using it because I will use the SP many times in the production phase.
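Yes; the order of statements is what matters. The SELECT produces the result set and sends it to the client before the DROP runs, so dropping afterwards loses nothing. A sketch of the stored procedure body:

create table #datatable (id int, name varchar(100), email varchar(10),
                         phone varchar(10), cellphone varchar(10), none varchar(10))
insert into #datatable exec ('select * from datatable where id = 1')
select * from #datatable        -- this result set is what the presentation layer receives
drop table #datatable           -- explicit cleanup; it is also dropped automatically when the procedure ends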
When I enter data in a table, SQL Server automatically pads the data with blanks up to the length of the column. This happens in a web form and also in Management Studio. What am I doing wrong? Does collation have anything to do with it? SQL Server 2005 / Developer Edition.
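If the column is declared char(n) or nchar(n), this padding is the documented fixed-width behaviour and has nothing to do with collation; switching to varchar stops it. A hedged sketch with placeholder names:

ALTER TABLE dbo.MyTable ALTER COLUMN MyCol varchar(50) NOT NULL   -- match the column's existing nullability
UPDATE dbo.MyTable SET MyCol = RTRIM(MyCol)                       -- strip the padding already stored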
So the query I am looking for would basically add some values to the results where they meet the right criteria.
I am running the query through a view, so I don't actually want to add the values to the physical tables, only to the view of the results, if you understand what I mean.
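One hedged way to do that is to compute the adjusted value inside the view itself, leaving the base tables untouched; every name and the criterion below are placeholders:

CREATE VIEW dbo.MyAdjustedView
AS
SELECT t.id,
       t.amount,
       CASE WHEN t.category = 'special'   -- "the right criteria"
            THEN t.amount + 10            -- the value to add
            ELSE t.amount
       END AS adjusted_amount
FROM dbo.MyTable t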