Transact SQL :: Additional Column With Bulk Insert?
Aug 11, 2015
I need a way to add an extra column every time I bulk insert data into an existing table from a new flat file, so that the rows identify the file from which, or the time when, the data was inserted.
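One common approach (a sketch only; the table and file names below are hypothetical) is to bulk load into a staging table shaped like the flat file and then copy the rows across while adding the file name and load time:

-- hypothetical staging table shaped like the flat file
CREATE TABLE dbo.Staging_Import (col1 varchar(50), col2 varchar(50));

BULK INSERT dbo.Staging_Import
FROM 'C:\import\file1.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

INSERT INTO dbo.ExistingTable (col1, col2, SourceFile, LoadDate)
SELECT col1, col2, 'file1.txt', GETDATE()
FROM dbo.Staging_Import;

TRUNCATE TABLE dbo.Staging_Import;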
View 2 Replies
Jun 5, 2015
I try to import data with bulk insert. Here is my table:
CREATE TABLE [data].[example](
col1 [varchar](10) NOT NULL,
col2 [datetime] NOT NULL,
col3 [date] NOT NULL,
col4 [varchar](6) NOT NULL,
col5 [varchar](3) NOT NULL,
[Code] ....
My format file:
10.0
7
1 SQLCHAR 0 10 "@|@" 2 Col2 ""
1 SQLCHAR 0 10 "@|@" 3 Col3 ""
2 SQLCHAR 0 6 "@|@" 4 Col4 Latin1_General_CI_AS
[Code] .....
The first file field should be stored twice (in col2 and col3) in my table.
My file:
Col1,Col2,Col3,Col4,Col5,Col6,Col7
2015-04-30@|@MDDS@|@ADP@|@EUR@|@185.630624@|@2015-04-30@|@MDDS
2015-04-30@|@MDDS@|@AED@|@EUR@|@4.107276@|@2015-04-30@|@MDDS
My command:
bulk insert data.example
from 'R:epoolexample.csv'
WITH(FORMATFILE = 'R:cfgexample.fmt' , FIRSTROW = 2)
Get error:
Msg 4823, Level 16, State 1, Line 2
Cannot bulk load. Invalid column number in the format file "R:cfgexample.fmt".
I changed some things, such as:
used ";" and "," as the column delimiter
changed the file type from UNIX to DOS and adjusted the row terminator in the format file accordingly
Removed this line from format file
1 SQLCHAR 0 10 "@|@" 2 Col2 ""
Nothing works ....
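For what it's worth, Msg 4823 usually means the host file field numbers don't line up with the field count declared on the format file's second line: with 7 declared fields, the first column of the field rows must run 1 through 7, each exactly once, and the last field's terminator is the row terminator. A single file field also cannot be mapped to two table columns, so loading the first value into both col2 and col3 needs a staging table or a follow-up UPDATE. A sketch of a structurally valid file (the column mapping, lengths and collations here are placeholders, not the real ones):

10.0
7
1   SQLCHAR   0   12   "@|@"    1   col1   ""
2   SQLCHAR   0   12   "@|@"    2   col2   ""
3   SQLCHAR   0   12   "@|@"    3   col3   ""
4   SQLCHAR   0   12   "@|@"    4   col4   ""
5   SQLCHAR   0   20   "@|@"    5   col5   ""
6   SQLCHAR   0   12   "@|@"    0   skip1  ""
7   SQLCHAR   0   12   "\r\n"   0   skip2  ""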
View 7 Replies
View Related
Apr 18, 2008
Hello,
I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab-delimited text file into an empty SQL table that has an identity column defined. How do I tell the Bulk Insert task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want to keep the identity column in the table too.
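One workaround that may apply here (a sketch, not SSIS-specific; object names are hypothetical): bulk insert into a view that leaves the identity column out, and let the identity values be generated.

-- dbo.MyTable(Id INT IDENTITY(1,1), Col1 varchar(50), Col2 varchar(50))
CREATE VIEW dbo.MyTable_Load AS
SELECT Col1, Col2 FROM dbo.MyTable;
GO
BULK INSERT dbo.MyTable_Load
FROM 'C:\data\import.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');

In the SSIS Bulk Insert task the same effect should come from pointing the task at the view, or from supplying a format file that omits the identity column.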
Thanks.
View 8 Replies
View Related
Nov 12, 2007
Hi! This is my trigger, and I'd like to insert today's date into the DeletedDate column. The trigger is on tblA. tblA and tblB both have the same number of columns and the same fields, but I just added another column to tblB called DeletedDate, and I'd like to insert the date along with the other data. Thanks!!!
Insert into tblB SELECT* FROM Deleted
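If DeletedDate is the last column of tblB, one option (a sketch based on that assumption) is to append GETDATE() to the select list:

-- assumes DeletedDate is the last column of tblB
INSERT INTO tblB
SELECT d.*, GETDATE()
FROM Deleted AS d;

Otherwise, list the columns of tblB explicitly and put GETDATE() in the matching position.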
View 2 Replies
View Related
May 15, 2015
I have a file with some wind data that I am trying to import into a SQL database through bulk insert. If the script works as it is supposed to, I should see 144 rows affected, but I see 0 rows affected.
BULK INSERT TOWER.RAWINTERFACE_1058 FROM 'C:Temp900020150427583.txt'
WITH (CHECK_CONSTRAINTS, CODEPAGE='RAW', DATAFILETYPE='char', FIELDTERMINATOR=' ', ROWTERMINATOR='\n', FIRSTROW=172)
The code works if the file is large, but if it is small, 0 rows are affected. Also, if I remove the header rows, the file works again. I want to understand what is going on here. I am including a screenshot of the file in Notepad++. I have tried changing the row terminator several times and also tried changing the codepage, but nothing seems to work. No error file is generated either, if I give an error file option.
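One thing worth checking (a guess based on these symptoms): FIRSTROW skips rows purely by counting row terminators, so if the header block uses a different line ending than the data rows (for example CR/LF versus a bare LF), the count goes off and a small file can be consumed entirely without any error being raised. Specifying the terminator as a hex byte pins it down; the path below is a placeholder:

BULK INSERT TOWER.RAWINTERFACE_1058
FROM 'C:\Temp\winddata.txt'
WITH (CHECK_CONSTRAINTS, CODEPAGE = 'RAW', DATAFILETYPE = 'char',
      FIELDTERMINATOR = ' ', ROWTERMINATOR = '0x0a',  -- 0x0a = LF; use 0x0d0a for CR/LF
      FIRSTROW = 172);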
View 7 Replies
View Related
May 16, 2012
I need an fmt (format) file for the values below:
“Stuid”,”Stuname”,”Class”,”DOJ”,”English”,”Math”,”Science”
"S1","Ram","10/31/2011,Monday",40,32,50
"S2","Bala","10/31/2011,Monday",50,45,69
"S3","Sam","10/31/2011,Monday",74,78,79
"S4","Jon","10/31/2011,Monday",65,58,89
"S5","Jos","10/31/2011,Monday",41,25,69
"S6","Jim","10/31/2011,Monday",74,41,41
"S7","Jack","10/31/2011,Monday",98,57,47
"S8","Sate","10/31/2011,Monday",87,73,45
"S9","Brb","10/31/2011,Monday",47,89,65
"S10","Jom","10/31/2011,Monday",14,100,47
View 15 Replies
View Related
Nov 20, 2015
SQL Server 2012
I want to be able to run the following command from SSMS (as an ad-hoc query).
BULK INSERT Database_Name.dbo.Table_Name FROM 'serverfile.txt' WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '0x0a', MAXERRORS = 0);
When I do I get:
Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "serverfile.txt" could not be opened. Operating system error code 5(Access is denied.).
I have full access to the file.I can do the same command successfully if the file is stored on a local drive on the server.
According to my DBA I can not run it with a remote file location because I don't have the SA permission. His solution is for me to create a job that runs the command. I have done so and the job works correctly.
Is he correct that there is no way for me to be able to run it from SSMS without SA permissions?
View 5 Replies
View Related
Nov 5, 2010
I have a CSV file that I am trying to bulk load into a temp table. The data in the file is all jumbled together, as in, there does not appear to be a row terminator. However, I do see a bunch of little rectangular boxes that I assume are the row terminators.
When I run the bulk insert, the data is treated as one string. For example... If I have 10 columns in the table, the 10 columns will be populated, but the remainder of the data is dumped into the last column.
Here are the row terminators I have used so far that haven't worked: '\n', '\r\n', '\r', CRLF.
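The rectangular boxes usually mean the terminator is a non-printable character rather than a visible CR/LF pair. Rather than guessing, the terminator can be given as a hexadecimal byte (the table and path below are placeholders):

BULK INSERT #ImportStage
FROM 'C:\data\import.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a');

Opening the file in a hex editor (or Notepad++ with "Show All Characters") shows the actual terminator byte, which can then be supplied the same way.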
View 31 Replies
View Related
Aug 5, 2015
USE TEST
GO
/****** BULK INSERT ******/
BULK
INSERT [Table01]
FROM 'C:empdata.csv'
[code]....
I am using the above code to insert CSV file data, which includes Arabic data as well. The upload is successful; however, the Arabic field data is loaded with invalid characters and I get the following error: Msg 4864, Level 16, State 1, Line 3... Bulk load data conversion error (type mismatch or invalid character for the specified codepage).
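Msg 4864 on the Arabic columns is usually a codepage mismatch between the file and the CODEPAGE option. If the CSV is saved as ANSI Arabic, naming that codepage explicitly may be enough; if it is saved as Unicode, DATAFILETYPE = 'widechar' is the usual route. A sketch with an assumed path:

BULK INSERT [Table01]
FROM 'C:\temp\data.csv'
WITH (CODEPAGE = '1256',          -- Windows Arabic ANSI codepage
      FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

For a UTF-16 file, drop the CODEPAGE option and use DATAFILETYPE = 'widechar' instead.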
View 15 Replies
View Related
Sep 15, 2015
How do I do a bulk insert into a temp table from a text file. Text file looks like that:
ver_id TYPE
E57AB326-803C-436E-B491-398A255C919A 58
34D2A601-6BBA-46B1-84E1-3A805FDA3812 58
986E140C-62F1-48F1-B428-3571EBF00DA0 58
My statement looks like this:
CREATE TABLE [dbo].[tblTemp]([ver_id] [nvarchar](255), [TYPE] [smallint])
GO
BULK INSERT [dbo].[tblTemp]
FROM 'C:v2.txt'
I keep receiving errors.
The error I receive is: Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (TYPE).
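The row 1 failure points at the header line ("ver_id TYPE") being read as data and failing to convert to smallint. Skipping it with FIRSTROW and naming the delimiters is usually enough; the sketch below assumes the file is tab-delimited:

BULK INSERT [dbo].[tblTemp]
FROM 'C:\v2.txt'
WITH (FIRSTROW = 2, FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');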
View 2 Replies
View Related
May 14, 2015
I am using a BCP format file to import a CSV file. The file looks like the following:
"01","02"
The format file looks like the following:
6.0
2
1 SQLCHAR 0 0 """ 0 ""
2 SQLINT 0 0 "","" 1 MROS
3 SQLINT 0 0 ""
" 2 MROF
When both fields are set to the SQLCHAR data type, the data imports successfully without the quotes, as 01 and 02. These fields will always be numbers and I want them as integers, so I set the data type to int in the database and SQLINT in the format file. The result was that the 01 became 12592 and the 02 became 12848. Where are these numbers coming from?
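Those numbers are the raw ASCII bytes of the text reinterpreted as binary integers: '01' is bytes 0x30 0x31, which read as a little-endian two-byte value gives 0x3130 = 12592, and '02' gives 0x3230 = 12848. SQLINT in a format file declares that the field is stored in the file in binary form, not as digits. For a character file, keep the fields declared as SQLCHAR and let the server convert the text into the int table columns; a sketch based on the file shown above:

6.0
3
1   SQLCHAR   0   0   "\""       0   ""
2   SQLCHAR   0   0   "\",\""    1   MROS
3   SQLCHAR   0   0   "\"\r\n"   2   MROF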
View 7 Replies
View Related
Dec 3, 2015
I am running Microsoft SQL Server 2012 SP on a Windows Server 2008 R2 Standard SP1 box. The SQL Server service is running as a simple windows domain user (nothing special, no admin rights, etc.) I am having some issues with using Bulk Insert when the data file is on a network share when using Windows Authentication. What is known is that the SQL Server service account has access to the network resource, which is shown by logging into SQL Server with a SQL account and doing the Bulk Insert. I also have rights to the files on the share, as shown by the fact that I put the files there. My SQL is in the form of:
Bulk Insert [table name] From '[server][share][filename]' With (FirstRow = 2, FormatFile='FormatFile.xml')
Now, when connecting to SQL Server with Windows Authentication and running the Bulk Insert I get the following error:
Msg 4861, Level 16, State 1, Line 2 Cannot bulk load because the file "[server][share][filename]" could not be opened. Operating system error code 5(Access is denied.).
I found this snippet in BULK INSERT (Transact-SQL), under Security Account Delegation (Impersonation), which says, in part (emphasis mine):
To resolve this error [4861], use SQL Server Authentication and specify a SQL Server login that uses the security profile of the SQL Server process account, or configure Windows to enable security account delegation. For information about how to enable a user account to be trusted for delegation, see How to Configure the Server to be Trusted for Delegation.
We tried unconstrained delegation and I rebooted the SQL server, but it still does not work. Later we tried constrained delegation and it still does not work.
I have verified the SPNs:
C:>setspn adsvc_sql
Registered ServicePrincipalNames for CN=SVC_SQL,OU=Service Accounts,OU=Users,OU=ad domain,DC=ad,DC=local:
MSSQLSvc/SQLQA.ad.local:1433
MSSQLSvc/SQLDev.ad.local:1433
MSSQLSvc/SQLQA.ad.local
MSSQLSvc/SQLDev.ad.local
I have verified that my SQL connection is TCP and I am getting/using a Kerberos security token.
C:>sqlcmd -S tcp:SQLQA.ad.local,1433 -E
1> Select dec.net_transport, dec.auth_scheme From sys.dm_exec_connections As dec Where session_id = @@Spid;
2> go

net_transport auth_scheme
------------- -----------
TCP           KERBEROS

(1 rows affected)
1>
If I move the source file to a local drive (on the SQL server), everything works fine, but I need to be able to read from a file share.
View 8 Replies
View Related
Aug 4, 2015
Is there a way to bulk remove spaces from column names from all tables in a db?
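There's no single built-in command for that, but the rename statements can be generated from the catalog views, reviewed, and then run; a sketch (watch for clashes where the stripped name already exists, and for code that references the old names):

SELECT 'EXEC sp_rename ''' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + '.'
       + QUOTENAME(c.name) + ''', ''' + REPLACE(c.name, ' ', '') + ''', ''COLUMN'';'
FROM sys.columns AS c
JOIN sys.tables  AS t ON t.object_id = c.object_id
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE c.name LIKE '% %';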
View 6 Replies
View Related
Sep 6, 2006
I have no problem importing a file.txt to my table (mehet).
Bulk Insert mehet From 'C: est.txt'
With (DataFileType = 'char', FIELDTERMINATOR = ',')
But I would appreciate it if someone could show me how to import only one or two columns instead of all of them.
Thanks.
juvan
View 5 Replies
View Related
Sep 13, 2007
Hi,
I have a question on inserting data for only a specific column in a table.
I have a table as follows
Table <MyTable>
{
Name varchar,
DateUpdate DateTime
}
I want to insert the data from a file into the table.
The File contains the list of name as follows (line by line)
name1
name2
name3
name4
......
The file name actually contains the DateTime
I would like to insert the names in the file as well as the DateTime (i.e., part of the file name) into <MyTable>.
I guess "Bulk insert " doens't allow to insert values for only one column
If i change the contents of my data file to
name1 | DateTime1
name2 | DateTime2
name3 | DateTime3
name4 | DateTime4
Then the following query works fine for me:
Bulk Insert <MyTable> FROM <filePath>
With
(
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n'
)
But my original file will contain only names, and the file name contains the date that is common for all the names in the file. Also, the file may contain millions of names.
Is there any way this can be accomplished using "Bulk Insert"? Or is there an alternative that can do it quickly?
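BULK INSERT itself can't add a value that isn't in the file, but the usual pattern is to load the names into a staging table and then copy them into <MyTable> together with the date parsed out of the file name; the names and path below are placeholders:

CREATE TABLE #NamesStage (Name varchar(100));

BULK INSERT #NamesStage
FROM 'C:\data\Names_20070913.txt'
WITH (ROWTERMINATOR = '\n');

DECLARE @FileDate datetime;
SET @FileDate = '20070913';   -- extracted from the file name by the calling script or job

INSERT INTO MyTable (Name, DateUpdate)
SELECT Name, @FileDate FROM #NamesStage;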
Your answer will be appreciated
~mohan
View 6 Replies
View Related
Apr 16, 2008
Can anyone show me how to use the Bulk Insert and Export Column tasks in SSIS?
View 1 Replies
View Related
Jul 20, 2005
I am trying to bulk insert a text file. The file has fixed-length fields with no field terminators. BOL says that field terminators are only needed when the data does *not* contain fixed-length fields, which implies they are optional -- so I made a format file without any (two consecutive tabs with nothing between them). The following message resulted:

Server: Msg 4827, Level 16, State 1, Line 1
Could not bulk insert. Invalid column terminator for column number 1 in format file

That sounds like I am required to have some sort of terminator in the format file, even though there aren't any in the data file. Unfortunately, the documentation on bcp/bulk copy and format files does not directly address this point, and I would appreciate some help.

BTW, putting '""' (empty string) for the terminator also leads to errors, with the first field overflowing -- bulk insert can't figure out where it ends.

Thanks,
Jim Geissman
Countrywide Home Loans
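For what it's worth, a fixed-length format file does still need something in the terminator column for every field: the usual pattern is "" (an empty string) for each field except the last, whose terminator is the line ending, with prefix length 0 and the host data length (the fourth column) set to the exact width of each field. The overflow described above is typically what happens when that length column doesn't match the actual field widths. A sketch for a hypothetical three-field layout:

8.0
3
1   SQLCHAR   0   10   ""       1   Field1   ""
2   SQLCHAR   0   25   ""       2   Field2   ""
3   SQLCHAR   0   8    "\r\n"   3   Field3   ""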
View 3 Replies
View Related
Feb 3, 2010
We can easily load a file into db tables. However, my main concern here is the number of columns in the file. A text file, TEXT_1400.txt, has 1400 columns. I am unable to load data into my db table using BCP or BULK INSERT commands, as a maximum of 1024 columns is allowed per table in SQL Server 2008.
We can still go ahead and create a 'Wide Table' (a special table that holds up to 30,000 columns; the maximum size of a wide table row is 8,019 bytes). But when operating on a wide table, BCP/BULK INSERT commands still fail. After a few hours of scratching my head over BCP and BULK INSERT, I observed that while inserting, BCP/BULK INSERT commands are unable to identify SPARSE columns and skip them, which disturbs the column mapping and results in data conversion and truncation errors.
Is there any proper way to load this kind of files into the db table?
View 6 Replies
View Related
Jan 30, 2004
Hi,
I have a text file with a single column that I need to bulk insert into a table with 2 columns - an ID (with identity turned on) and col2.
my text file looks like:
row1
row2
row3
...
row10
So my bulk insert looks like this:
BULK INSERT test FROM 'd: estBig.txt'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
but i get the error:
Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.
However, as you can see from the text file, there is only one column, so I don't have any field terminators.
Any ideas how to make this work?
Thanks.
View 4 Replies
View Related
Sep 2, 2006
I have a file I'm trying to do some non-set-based processing with. In order to make sure I keep the order of the results, I want to BULK INSERT into a temp table with an identity column. The spec says that you should be able to use either KEEPIDENTITY or KEEPNULLS, but I can't get it to work. For once, I have full code - just add any file of your choice that doesn't have commas/tabs. :) Any suggestions, folks?

create table ##Holding_Tank (full_record varchar(500)) -- this works
create table ##Holding_Tank (id int identity(1,1) primary key, full_record varchar(500)) -- that doesn't work

BULK INSERT ##Holding_Tank
FROM "d: elnet_scriptspsaxresult.txt"
WITH (TABLOCK, KEEPIDENTITY, KEEPNULLS, MAXERRORS = 0)

select * from ##Holding_tank
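KEEPIDENTITY only matters when the data file itself supplies identity values; here the file has just the one text column, so without further instructions the server still expects a value for both columns. The usual workaround is a format file that maps the single file field to full_record and leaves id to be generated; a sketch (8.0 is the SQL Server 2000 format version, and the paths are placeholders in the document's own style):

8.0
1
1   SQLCHAR   0   500   "\r\n"   2   full_record   ""

BULK INSERT ##Holding_Tank
FROM '<path to psaxresult.txt>'
WITH (FORMATFILE = '<path to holding_tank.fmt>', TABLOCK, MAXERRORS = 0);

Whether the generated identity values follow the file order exactly is not guaranteed in general, which is worth keeping in mind given that preserving order is the goal here.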
View 2 Replies
View Related
Sep 29, 2007
Hi,
I have a data file in the following format:
SubjectId1|class1
SubjectId2|class2
SubjectId3|class3
I just want to insert only the SubjectIds into my table 'Subjects', which has the following schema, ignoring the classes.
The row delimiter is a newline and the column delimiter is '|'.
Table Subjects
{
ID (Autoincrement)
SubjectId varchar(20)
}
Can anyone provide the format file for doing this or suggest another way to do it?
Please note that the file may contain millions of records.
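A non-XML format file can drop a file field by pointing it at server column 0. Assuming '|' between the two fields, newline-terminated rows and SQL Server 2005 (format version 9.0), a sketch might look like this; ID is generated, so only SubjectId is mapped, and the paths are placeholders:

9.0
2
1   SQLCHAR   0   20   "|"    2   SubjectId   SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   50   "\n"   0   class       ""

BULK INSERT Subjects
FROM '<path to data file>'
WITH (FORMATFILE = '<path to format file>', TABLOCK);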
Thank u
~mohan
View 5 Replies
View Related
Jun 27, 2006
Hi,
For testing purposes I'm inserting a flat file into a SQL Server table using BULK INSERT with the following code:
BULK INSERT rsk_staging
FROM 'c: empulk
sk.txt'
WITH (
FIELDTERMINATOR = '',
ROWTERMINATOR = '\n',
CODEPAGE = 'RAW',
DATAFILETYPE = 'char',
BATCHSIZE = 100000,
ROWS_PER_BATCH = 1925604,
TABLOCK
)
I have two versions of "rsk.txt", one with 1.9 million rows and one with the first 2000 rows only. The files have only one column, 115 characters wide, that I'll split into several columns later using SUBSTRING. The one with 2000 rows fires into the database with no problems whatsoever using this exact code; the other one throws the following error:
Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.
How can I resolve this problem?
EDIT: I tried several different row and field terminators, but this exact one works for the small data file, so I assume it should also work for the large one... The large one, however, was copied directly using binary FTP from a Unix filesystem, and the small one was manually copied into a new txt file using UltraEdit.
View 1 Replies
View Related
Aug 21, 2007
Hi,
I have a problem with BULK INSERT. I created the following table:
Code Snippetcreate table Test
(id char(4), name nvarchar(16), last char(1))
I am trying to bulk insert data from an ASCII (not Unicode) file with only two rows:
0011First name
0018Second name
Since it is a fixed length file, I am using the following format file:
Code Snippet
8.0
3
1 SQLCHAR 0 4 "" 1 ID HEBREW_CI_AS
2 SQLCHAR 0 16 "" 2 NAME HEBREW_CI_AS
3 SQLCHAR 0 0 "\r\n" 3 Last HEBREW_CI_AS
With bcp utility everything works just fine!
Code Snippet
bcp Demo.dbo.test in c: est -T -f c: est.fmt
But when I use BULK INSERT in the following form:
Code Snippet
BULK INSERT Test FROM 'c:Test'
WITH
(
FORMATFILE='c:Test.fmt',
CODEPAGE='OEM'
);
I am getting error
Server: Msg 4863, Level 16, State 1, Line 1
Bulk insert data conversion error (truncation) for row 1, column 2 (name).
Now, one interesting thing: if I change the name field from nvarchar to varchar, it is working with BULK INSERT as well.
Can anybody explain what is going on here?
I am using MS SQL 2000 and MSDE
Thanks in advance,
Eugene.
View 2 Replies
View Related
May 6, 2008
Hello All,
I'm having a problem doing a bulk insert on a tab delimited text file into mssql 2005 using either bulk insert or bcp.
When using the following bulk insert command I get the "The column is too long in the data file for row 1, column 2" error.
I have tried
Code Snippet
BULK
INSERT
test.dbo.customerdefinition
FROM
'data_file.txt'
WITH
(
FORMATFILE = 'format_file.txt',
FIELDTERMINATOR = ' ',
ROWTERMINATOR = 'n',
KEEPIDENTITY
)
The data file only has data for the first 10 columns of a table with over 100 columns.
First 10 table columns have the format of
CREATE TABLE [dbo].[CustomerDefinition](
[Rowid] [int] IDENTITY(1,1) NOT FOR REPLICATION NOT NULL,
[CustomerId] [varchar](15) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[Name] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[Addr1] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[Addr2] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[Addr3] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[City] [varchar](30) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[State] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[Zipcode] [varchar](10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[Country] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
CONSTRAINT [PK_CustomerDefinition] PRIMARY KEY CLUSTERED
The data-file looks like this (it is tab delimited):
1 1 BEN ONE BENVENUTO 1 BENVENUTO ST. CLAIR & AVENUE ROAD TORONTO ON
2 1 BIGGIN DDP PARTNERSHIP 1 BIGGIN LTD. 1 BIGGIN COURT NORTH YORK ON
3 1 EVA MELIA CORPORATION 1 EVA ROAD SUITE #412 ETOBICOKE ON
4 1 FINANC CONCERT PROPERTIES 200 BAY STREET- SOUTH TOWER SUITE 2100- PO BOX 56 TORONTO ON
5 1 LONGBRID BERKLEY PROPERTY MANAGEMENT INC 1 LONGBRIDGE ROAD 2ND FL THORNHILL ON
6 10 DORA VILLA LASFLORES C/O FOCUS PROPERTIE 10 DORA AVENUE TORONTO- ON
7 10 HOLMES HALTON COMMUNITY HOUSING 10 HOLMESWAY PLACE ACTON ON
8 100 CANYON DEL PROPERTY MANAGEMENT 100 CANYON AVENUE BATHURST & SHEPPARD NORTH YORK ON
9 100 CEDAR LAWRENCE CONSTRUCTION 100 CEDAR AVENUE YONGE & MAJOR MACKENZIE WEST RICHMOND HILL ON
10 100 GOWAN KANCO - 100 GOWAN LTD. 100 GOWAN AVENUE PAPE & DANFORTH EAST YORK ON
The format-file looks like this:
9.0
10
1 SQLINT 0 4 " " 1 Rowid ""
2 SQLCHAR 2 15 " " 2 CustomerId SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 2 50 " " 3 Name SQL_Latin1_General_CP1_CI_AS
4 SQLCHAR 2 50 " " 4 Addr1 SQL_Latin1_General_CP1_CI_AS
5 SQLCHAR 2 50 " " 5 Addr2 SQL_Latin1_General_CP1_CI_AS
6 SQLCHAR 2 50 " " 6 Addr3 SQL_Latin1_General_CP1_CI_AS
7 SQLCHAR 2 30 " " 7 City SQL_Latin1_General_CP1_CI_AS
8 SQLCHAR 2 50 " " 8 State SQL_Latin1_General_CP1_CI_AS
9 SQLCHAR 2 10 " " 9 Zipcode SQL_Latin1_General_CP1_CI_AS
10 SQLCHAR 2 50 " " 10 Country SQL_Latin1_General_CP1_CI_AS
I can't for the life of me understand what I'm doing wrong but if someone can help me out here it would be greatly appreciated.
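A couple of things stand out, assuming the data file really is plain tab-delimited text: every field in the format file should normally be SQLCHAR with a prefix length of 0 (SQLINT with prefix 2 tells bcp the file holds length-prefixed binary values, which a text export does not), and the terminators need to be the literal tab and line ending rather than spaces. The ROWTERMINATOR = 'n' in the BULK INSERT would also be read as the letter n, although the format file normally governs the terminators when FORMATFILE is given. A sketch of the format file under those assumptions (table columns beyond the first ten are simply left out and must be nullable or have defaults):

9.0
10
1    SQLCHAR   0   12   "\t"     1    Rowid        ""
2    SQLCHAR   0   15   "\t"     2    CustomerId   SQL_Latin1_General_CP1_CI_AS
3    SQLCHAR   0   50   "\t"     3    Name         SQL_Latin1_General_CP1_CI_AS
4    SQLCHAR   0   50   "\t"     4    Addr1        SQL_Latin1_General_CP1_CI_AS
5    SQLCHAR   0   50   "\t"     5    Addr2        SQL_Latin1_General_CP1_CI_AS
6    SQLCHAR   0   50   "\t"     6    Addr3        SQL_Latin1_General_CP1_CI_AS
7    SQLCHAR   0   30   "\t"     7    City         SQL_Latin1_General_CP1_CI_AS
8    SQLCHAR   0   50   "\t"     8    State        SQL_Latin1_General_CP1_CI_AS
9    SQLCHAR   0   10   "\t"     9    Zipcode      SQL_Latin1_General_CP1_CI_AS
10   SQLCHAR   0   50   "\r\n"   10   Country      SQL_Latin1_General_CP1_CI_AS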
Thanks,
View 3 Replies
View Related
Apr 24, 2008
I am currently using openquery to insert data into a SQL 2000 database from a Lotus Notes database. The Lotus database is a linked server with a datasource named CLE_CARS_SF. My SQL table is called Webcases.
The query below works well because the table's columns are even in both databases:
Insert into Webcases select * from openquery(CLE_CARS_SF,
'Select * from Web_Cases')
I am moving this over to SQL 2005. The query works well, but I want to add a column to the Webcases SQL database and manually insert a value along with the openquery values.
My insert statement above no longer works because the column numbers don't match.
In a nutshell I would like a way to combine the following queries:
Insert into Webcases select * from openquery(CLE_CARS_SF,
'Select * from Web_Cases')
Insert into Webcases (insurancetype) Values ('SF')
--insurancetype is the new column.
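One way to combine them (a sketch, assuming insurancetype was added as the last column of Webcases) is to select the constant alongside the remote columns:

-- assumes insurancetype is the last column of Webcases
INSERT INTO Webcases
SELECT rmt.*, 'SF'
FROM OPENQUERY(CLE_CARS_SF, 'Select * from Web_Cases') AS rmt;

If it is not the last column, list the Webcases columns explicitly in the INSERT and put 'SF' in the matching position of the SELECT.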
View 9 Replies
View Related
Mar 18, 2015
Can we bulk insert only the desired column from a flat file to a table?
I am using SSIS to bulk insert from a file with more than 200 columns. I am trying to find a way to bulk insert them into multiple tables through SSIS.
The one way I can think of is to pre-map the columns from the file to the destination tables and build numerous Bulk Insert tasks to achieve that, but I am not sure if SSIS will let me do that.
View 4 Replies
View Related
Jun 3, 2015
I am using SQL Server Data Tools for Visual Studio 2012. I have a very simple SSIS package with a Data Flow task that exports from an OLE DB Source to a tab-delimited unicode Flat File Destination and a Bulk Insert task that loads from the file. Both the Flat File Destination and Bulk Import are using the same code page. The Bulk Insert task is using the wide char format to read from the file. The process works fine with nvarchar and int columns, but when I add a unique identifier column it fails with "type mismatch or invalid character for the specified code page".
View 5 Replies
View Related
Sep 28, 2015
I have two employee tables called EmpA and EmpB. Each table has the same attributes, Employee ID and Email address. I do an inner join on email address like this:
select * from EmpA
inner join EmpB on EmpA.email = EmpB.email
where EmpB like '%@mydomain.com'
I now want to modify the above where I want to output rows such that
EmpA.employeeid <> EmpB.employeeid
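Adding the inequality to the WHERE clause (and qualifying the LIKE with the email column, which is presumably what was meant) does that:

SELECT *
FROM EmpA
INNER JOIN EmpB ON EmpA.email = EmpB.email
WHERE EmpB.email LIKE '%@mydomain.com'
  AND EmpA.employeeid <> EmpB.employeeid;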
View 9 Replies
View Related
Nov 2, 2007
Does anyone know how to do a bulk insert using just the script task? I've been searching everyehere but can't seem to find a sample.
View 6 Replies
View Related
Jun 8, 2015
How do I INSERT (add) a new column (field) into MyTable? The columns are:
1. FirstName nvarchar(50)
2. LastName nvarchar(100)
3. Student bit Checked
4. CreateDate datetime
5. HwoOld float Checked
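Adding a new column (field) to an existing table is done with ALTER TABLE; for example, assuming HwoOld is the column being added:

ALTER TABLE MyTable
ADD HwoOld float NULL;   -- nullable, so existing rows are unaffected

A NOT NULL column can be added the same way as long as a DEFAULT constraint is supplied so existing rows get a value.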
View 6 Replies
View Related
Mar 10, 2006
Part of the code as follows:
SELECT
ARDoc."Cpnyid", ARDoc."Custid", ARDoc."CuryOrigDocAmt", ARDoc."DocBal", ARDoc."DocDate", ARDoc."Doctype", ARDoc."slsperid", ARDoc."Territory", ARDoc."RecordType", ARDoc."user7",
ARTran."CmmnPct", ARTran."CuryTranAmt", ARTran."DrCr", ARTran."ExtCost", ARTran."InvtId", ARTran."JrnlType", ARTran."Qty", ARTran."RefNbr", ARTran."Rlsed", ARTran."S4Future04", ARTran."S4Future05", ARTran."TranAmt", ARTran."TranClass", ARTran."TranDate", ARTran."TranType", ARTran."UnitDesc", ARTran."UnitPrice",
RptCompany."CpnyName", RptCompany."RI_ID",
Customer."Name",
Salesperson."CmmnPct", Salesperson."Name", Salesperson."SlsperId"
FROM
{ oj ((("SOLUSBS02APP"."dbo"."zARDoc_Comm" ARDoc INNER JOIN "SOLUSBS02APP"."dbo"."RptCompany" RptCompany ON
ARDoc."Cpnyid" = RptCompany."CpnyID")
INNER JOIN "SOLUSBS02APP"."dbo"."Customer" Customer ON
ARDoc."Custid" = Customer."CustId")
LEFT OUTER JOIN "SOLUSBS02APP"."dbo"."Salesperson" Salesperson ON
ARDoc."slsperid" = Salesperson."SlsperId")
LEFT OUTER JOIN "SOLUSBS02APP"."dbo"."ARTran" ARTran ON
ARDoc."Custid" = ARTran."CustId" AND
ARDoc."Refnbr" = ARTran."RefNbr" AND
ARDoc."Doctype" = ARTran."TranType"}
Currently, if a new rep takes over an old one's invoices and accounts, he will also get credit on the report this query is for. Instead I need to use the SOShipHeader table to make this 'date sensitive'. SOShipHeader will have the correct 'SlsperID', but I will still need to pull the name from Salesperson."Name".
My guess would be that I need to wedge the SOShipHeader table between the ARDoc and Salesperson tables?
View 5 Replies
View Related
May 28, 2015
The reason cust_id started at #4 and not #1 is that I failed to insert properly three times in a row, for having "Tatoine" instead of "WI" (a state value that doesn't fit the nchar(5) column), correct? Then when I ran a valid statement, the row was created starting at number four. I imagine this prevents users from having duplicate cust_ids. This, however, is also where rollback and similar commands could be handy, correct? Or is there something more obvious I'm missing to keep a failed "insert into" from incrementing the cust_id? Rows 1, 2 and 3 do not exist, I believe, and are not null. Having null values would have contradicted the table, where two "not null" columns are a requirement.
CREATE TABLE customersnew
(
cust_id INT NOT NULL IDENTITY(1,1),
cust_name NCHAR(50) NOT NULL,
cust_address NCHAR(50) NULL,
cust_city NCHAR(50) NULL,
cust_state NCHAR(5) NULL,
[code]...
View 2 Replies
View Related
May 29, 2015
I have a table with about 10000 rows; there is a column named Raw_XMLData that is defined as varchar, but the data is in XML format.
I want to insert into another table only the rows where the Raw_XMLData column contains valid XML data.
Is it possible to do this in T-SQL?
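Assuming SQL Server 2012 or later, TRY_CONVERT can be used as the filter, since it returns NULL when the text cannot be converted; the table names below are placeholders:

-- keeps only rows whose varchar text parses as XML
INSERT INTO dbo.TargetTable (Raw_XMLData)
SELECT s.Raw_XMLData
FROM dbo.SourceTable AS s
WHERE TRY_CONVERT(xml, s.Raw_XMLData) IS NOT NULL;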
View 2 Replies
View Related