I'm new to SQL Server. I installed a copy of 6.5 on my server and set it up today. I received a db from a colleague and have been unable to find out what the column data types and lengths are. This may be very easy, but I need to know.
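A couple of ways to list column types and lengths (a sketch; 'MyTable' stands in for your actual table name):

-- sp_help reports each column's type and length, and works back to 6.5
EXEC sp_help 'MyTable'

-- or query the system tables directly (old-style join for 6.5 compatibility)
SELECT c.name AS column_name, t.name AS type_name, c.length
FROM syscolumns c, systypes t
WHERE c.usertype = t.usertype
  AND c.id = OBJECT_ID('MyTable')
ORDER BY c.colid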
I'm trying to build a process within SSIS and running into a large number of roadblocks. I could just drop into T-SQL and code this entire thing, but I'm trying not to do that, both so I can take advantage of the parallel processing pieces and so I can learn a few things along the way.
Here's the overview of what I need to do on a daily basis:
1. FTP any files I haven't already grabbed to a local directory
2. Make a copy of the files into an archive directory
3. gunzip the files in my working directory
4. Remove any zero-length files
5. For each file in the directory, run a data flow task
6. At the successful completion of the data import, log the imported filename into a table, along with the row count
7. If 6 is true, delete the file from the working directory
I gave up trying to get the FTP task to do #1 and I also gave up trying to get a script task written to do this. So, I wrote a stored procedure that will grab any files off an FTP site that are not already in my archive directory.
I figured out that the File System Task is incapable of handling wildcard characters to basically do "copy *.gz...". I could have done this in very little code within T-SQL as well, but decided to use a for each loop to go one file at a time and use the file system task to copy each file.
I've already set up the for each loop with the execute process task to gunzip each file in the directory. I also figured out how to suppress the pop-up window as well as make it ignore any error messages. (Some of our files get an "unexpected eof" from gunzip, so I know that when we're done with the process, anything with a .gz extension had an error unzipping and needs to be kicked back to the provider.) I do have one issue with the fact that the execute process task leaves a copy of gzip.exe running in memory, which leaves the last file locked and has to be killed manually.

Now I'm on to the zero-length files issue. There isn't anything in the file system task that lets me perform operations based on a file attribute. I can do this in T-SQL by shelling out through xp_cmdshell, running dir /-C, piping that to a file, BCPing the file into a table, and then string-parsing the file names for everything with a length of 0 and issuing a delete command. I would really rather not do that. I'm fairly certain that I can do this with WMI, but I have no idea how to code that. Essentially, what I want to do is fire off a WMI query that pulls back a list of files in a given directory with a length of zero, hand that list to a for each loop, log each zero-length file name into a table within SQL Server as processed with 0 records, then delete the file.
Two questions:
1. Anyone know how to get the Execute Process task to not leave the last instance of the exe running when it completes?
2. Anyone know how to get a list of zero-length files from a specified directory to feed to a for each loop?
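For question 2, a WMI Data Reader Task can run a WQL query along these lines (a sketch; the drive and directory are placeholders, and note WQL wants doubled backslashes plus a trailing backslash on the path):

-- files of size zero in C:\Work\
SELECT Name FROM CIM_DataFile
WHERE Drive = 'C:' AND Path = '\\Work\\' AND FileSize = 0

The result set can then be handed to a Foreach loop to do the logging and deletes.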
I have written a stored proc that selects data from this table:

A1 AA
A2 BB
B1 AAA
B2 BBB
and puts it into another table in this manner:

A1 AA BB
A2 AA BB
B1 AAA BBB
B2 AAA BBB
In short, I wanted to concatenate data in the second column.
My question is: for some reason, SQL Server is limiting the length of the string to only 256 characters, even though I have defined the local variables in the stored proc as varchar(1000) and the table has that field defined as text.
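For reference, the usual variable-concatenation pattern looks like this sketch (table and column names are placeholders), and two common truncation culprits are worth checking: CONVERT/CAST to a character type without a length defaults to 30 characters, and Query Analyzer only displays 256 characters per column by default (Tools > Options > Results > Maximum characters per column):

DECLARE @concat varchar(1000)
SET @concat = ''
-- each row appends its Col2 value to the variable
SELECT @concat = @concat + ' ' + Col2 FROM MyTable WHERE GroupKey = 'A'
SELECT DATALENGTH(@concat)  -- check the real length, not just what the grid shows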
We can use 'sp_executesql' to execute any statements. I have searched, and it seems people need dynamic SQL only to process some table/column unknown in advance. My idea is that the dynamic SQL feature is ideal for passing blocks of code (aka delegates). Particularly, you may need to execute different procedures under some acquired locks: a procedure would acquire the locks, execute the code, and release the locks. The problem, however, is that I cannot find the specification for variable-length parameter lists. It seems not feasible for SPs. Nevertheless, sp_executesql itself does accept a variable number of parameters. How? Can we look at the definition?
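sp_executesql is not truly variadic: its second argument is itself a string that declares every parameter for that call, so the list can differ from call to call. A minimal sketch (the table and values are made up):

EXEC sp_executesql
    N'SELECT * FROM Orders WHERE CustomerID = @cust AND OrderDate >= @d',
    N'@cust int, @d datetime',
    @cust = 42, @d = '20070101'

As for looking at the definition: sp_executesql is implemented inside the server rather than as a T-SQL procedure, so there is no body to inspect with sp_helptext.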
I need to store quite a long description in the database, for which in any other database I would choose the data type 'text'. However, can someone tell me why the length is set to just 16 in SQL Server? I have seen a text field holding far more than 16 chars still set at length 16, so what does this length of 16 mean? Also, what is the max length of the text field?
Below is a SQL statement that shows what columns belong to a certain view. My next question is how can I get a third column that shows the column type ?
VARCHAR(100), or UNIQUEIDENTIFIER or INTEGER or TEXT or NVARCHAR(256) or ........????
select top 100 a.name, b.name from sys.views a inner join sys.columns b on a.object_id = b.object_id where a.name like '%Case_Times%'
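Joining sys.types adds the type name, and max_length/precision/scale complete the picture — a sketch (note that max_length is in bytes, so nvarchar columns show double their character length, and (max) types show -1):

select top 100 a.name, b.name, c.name as type_name, b.max_length, b.precision, b.scale
from sys.views a
inner join sys.columns b on a.object_id = b.object_id
inner join sys.types c on b.user_type_id = c.user_type_id
where a.name like '%Case_Times%'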
I would like to know the total length of the data types in a table. I ran the following query; will this give me the correct information? I also ran sp_columns <table name> and it too gives the lengths, but there is a difference in the numbers. Am I doing something wrong, and which is correct, the query or sp_columns?
select sum(length) from syscolumns where id in (select id from sysobjects where name = 'XYZABC')
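A likely source of the difference: syscolumns.length is the storage length in bytes, so nvarchar columns count double (the character count sits in the prec column), while sp_columns reports character-oriented lengths. Comparing the columns side by side makes the gap visible:

select name, length, prec from syscolumns where id = OBJECT_ID('XYZABC')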
Hi, a query is exceeding the length of my varchar and nvarchar variable, because I'm picking up the data from each record in a table and appending it to the query. Suggest me some way to do it. Sample query:

SELECT P1.*, (P1.Q1 + P1.Q2 + P1.Q3 + P1.Q4) AS YearTotal
FROM (SELECT Year,
             SUM(CASE P.Quarter WHEN 1 THEN P.Amount ELSE 0 END) AS Q1,
             SUM(CASE P.Quarter WHEN 2 THEN P.Amount ELSE 0 END) AS Q2,
             SUM(CASE P.Quarter WHEN 3 THEN P.Amount ELSE 0 END) AS Q3,
             SUM(CASE P.Quarter WHEN 4 THEN P.Amount ELSE 0 END) AS Q4
      FROM Pivot1 AS P
      GROUP BY P.Year) AS P1
GO

Even the P.Quarter field name is being generated dynamically. My query is exceeding the varchar and nvarchar limit. Thanks in advance.
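If this is SQL Server 2005, nvarchar(max) removes the 4000-character cap on the variable; on SQL Server 2000 the usual workaround is to EXEC a concatenation of several variables, each under the limit. A sketch of both:

-- SQL Server 2005+: (max) variables are not capped at 8000/4000
DECLARE @sql nvarchar(max)
SET @sql = N'SELECT ...'  -- build the dynamic query here
EXEC sp_executesql @sql

-- SQL Server 2000: EXEC() accepts concatenated chunks
-- EXEC (@sql1 + @sql2 + @sql3)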
'The value you entered is not consistent with the data type or length of the column'
When trying to enter data into a field: the field type is char and the length is 100. I'm entering text 3 words long, nowhere near 100 characters. Anyone know why this is happening?
Morning All. Can I have some help with this one please? I am having to make a fixed-length text file based on information from the DB.

Declare @EDIString varchar(MAX)
Declare @RecordType varchar(2)
Declare @RegistrationMark varchar(7)
Declare @Model_Chassis varchar(11)
Declare @LocationCode varchar(4)
Declare @MovementDate varchar(8)
Declare @IMSAccountCode varchar(5)
Declare @MovementType varchar(8)
Declare @NotUsed1 varchar(28)
Declare @NotUsed2 varchar(7)

Select @RecordType = RecordType, @RegistrationMark = RegistrationMark, @Model_Chassis = Model_And_Chassis, @LocationCode = LocationCode, @MovementDate = MovementDate, @IMSAccountCode = IMSAccountCode, @MovementType = MovementTypeCode from Fiat_OutBound

Once I have selected the information from the DB I need to ensure that each field is the correct length. I therefore want to pass the variable and the required length into a function that returns the value at the correct length. So if LocationCode = 'AB' and it needs to be four characters long, I want to pass it into a function and get back 'AB  '. As I need to do this for 70+ variables, is there an easy way to obtain the declared length of each variable?

regards
Tom
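There is no built-in way to read a local variable's declared length at runtime, so the width has to be passed in. A minimal padding helper, as a sketch (the function name is made up):

CREATE FUNCTION dbo.PadToLength (@value varchar(8000), @len int)
RETURNS varchar(8000)
AS
BEGIN
    -- right-pad with spaces to the fixed width, truncating anything longer
    RETURN LEFT(@value + REPLICATE(' ', @len), @len)
END

-- usage: SELECT dbo.PadToLength(@LocationCode, 4)  -- returns 'AB  '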
I have a requirement to import a file of rows containing fixed length data. The problem is that each row can be one of 5 different formats (i.e. different columns) -- where the "type" of row is indicated by the first two characters of the row. Each row gets inserted into its own table.
Could I use a simple Conditional Split to route the rows? Or is the split only for routing similar rows? Anyway, problems are never this simple...
In addition, each "grouping" of rows is related. The "first" row is considered the "primary" row (and gets a row id via IDENTITY), whereas the remaining rows in the group are "secondary" rows and have foreign key references back to the primary row's id.
Given (using spaces to separate columns and CrLf to show "grouping"):
So, the first 3 lines are all related to a MSFT record which needs to be spread across multiple tables. The next three lines are all related to AAPL, And the next FOUR lines (yes, each record can have zero, one, or more secondary rows) are related to CSCO.
If this is still not clear: all the "01" rows will be written to [Table1], with each row getting an IDENTITY value. All the "02" rows will be written to [Table2] with a FK pointing to the correct [Table1] row. All the "03" rows will be written to... and so on.
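A Conditional Split can do the routing (one output per record type, testing the first two characters of the line). For the primary/secondary linking, here is a hedged T-SQL staging sketch, assuming the file is bulk-loaded one raw line per row; every table and column name below is hypothetical:

-- stage the raw file, keeping arrival order
CREATE TABLE dbo.RawLines (LineNum int IDENTITY(1,1) PRIMARY KEY, Line varchar(8000) NOT NULL)

-- 1) primary ("01") rows; keep the source line number so secondaries can find their parent
INSERT INTO dbo.Table1 (SourceLineNum, Symbol)
SELECT LineNum, SUBSTRING(Line, 3, 10)
FROM dbo.RawLines
WHERE LEFT(Line, 2) = '01'

-- 2) secondary ("02") rows: each belongs to the nearest preceding "01" line (needs 2005+ for CROSS APPLY)
INSERT INTO dbo.Table2 (Table1Id, Payload)
SELECT p.Id, SUBSTRING(s.Line, 3, 8000)
FROM dbo.RawLines AS s
CROSS APPLY (SELECT TOP 1 t1.Id
             FROM dbo.Table1 AS t1
             WHERE t1.SourceLineNum < s.LineNum
             ORDER BY t1.SourceLineNum DESC) AS p(Id)
WHERE LEFT(s.Line, 2) = '02'
-- "03" rows follow the same pattern into Table3, and so on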
I am running SQL Server 2000 to parse and store records in the EDI X12 format. This consists of variable-length delimited records which I am passing to the "transforms" tab to process with VBScript.
The problem is that though each segment has a defined number of fields, N, the standard states that if the final M fields are empty/blank they are not to be sent. Thus, a segment defined to have 20 fields may have 6 the first time I see it, 13 the next time, etc. To access the columns in VBScript I use DTSSource("Col001"). This works as long as the columns are there, but gives an error when they are not. Is there a parameter telling me how many columns are defined? Or is there something akin to IFEXISTS("Colxxx"), or exceptions?
How can I handle this situation? One suggestion has been to pass the entire segment to the Transforms section and break it up there.
Finally, what resources can you point me to for reference? I'd like to get good at using DTS since my client wants their project written for it.
I have an SSIS package that creates a csv file based on an Execute SQL task.

The SQL is incredibly simple: select singlecolumn from table.

In the properties of the Execute SQL task I specify the result set as "full result set". When I run it I get this error:
The type of the value being assigned to variable "User::CSVoutput" differs from the current variable type.
Variables may not change type during execution. Variable types are strict, except for variables of type Object.
If I change the resultset to single row, then I only get the first row from the DB (same if I choose none). If I choose XML, then I get the error that the result is not XML. What resultset type do I need to choose in order to get all the rows into the CSV? The variable I am populating is User::CSVoutput, of type String.
I have this doubt and want to be sure if my thinking is correct.
Let's consider 2 tables, one with fixed-length columns (char) and the other with variable-length columns (varchar).
The table with fixed-length columns will always allocate the same size within a page; however, the table with variable-length columns will allocate only the actual length of the data within a page.
I think that updates on the table with fixed-length columns will have a higher chance of in-place updates, at least from a data-length perspective, whereas updates on the table with variable-length columns will produce more split updates, from a data-length perspective.
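One rough way to observe the storage difference behind that reasoning (a sketch assuming SQL Server 2005+; table names are made up):

CREATE TABLE dbo.FixedLen (c char(500) NOT NULL)
CREATE TABLE dbo.VarLen (c varchar(500) NOT NULL)
INSERT dbo.FixedLen VALUES ('abc')
INSERT dbo.VarLen VALUES ('abc')

-- avg_record_size_in_bytes shows char paying its full width, varchar paying actual length
SELECT 'FixedLen' AS tbl, avg_record_size_in_bytes
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.FixedLen'), NULL, NULL, 'DETAILED')
UNION ALL
SELECT 'VarLen', avg_record_size_in_bytes
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.VarLen'), NULL, NULL, 'DETAILED')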
I am trying to extract certain data from MySQL (an example statement is SELECT COLUMN1, COUNT(COLUMN2) FROM TABLE GROUP BY COLUMN1), store the COLUMN1 and COUNT(COLUMN2) values in variables, and then insert those values into a new table in SQL Server 2005 that has two columns (COLUMN1 of type nvarchar and COLUMN2 of type int). I can map the first output (COLUMN1) correctly to a variable of type String and store it in the new table (I store the value in a String variable, then insert it into SQL Server using a parameter in an Execute SQL Task with the data type set to NVARCHAR). But I cannot map the second output (COUNT(COLUMN2)) using the same method; I could not even get it to store into a variable of any type. Anyone have any idea how to go about it? Thanks in advance.
The output of the sql statement for MySQL should look like:
Is there any way, or a tool, to identify whether a procedure's parameter length was declared shorter than the table column's length?
I have a table
CREATE TABLE TEST001 (KeyName varchar(100))

and a procedure:

CREATE PROCEDURE SpFindNames (@KeyName varchar(40))
AS
BEGIN
    SELECT KeyName FROM TEST001 WHERE KeyName = @KeyName
END
Here the table column "KeyName", with a length of 100 characters, is compared with the SP parameter "@KeyName", which has a length of only 40 characters.
Is there any way to find all such usage across all the procedures in the database?
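There's no exact answer short of parsing the procedure text, but if parameters tend to be named after the columns they filter, a name-matching heuristic can flag candidates. A sketch (SQL Server 2005+ catalog views; expect false positives, since the join is by name only):

SELECT OBJECT_NAME(p.object_id) AS proc_name,
       p.name AS param_name, p.max_length AS param_len,
       OBJECT_NAME(c.object_id) AS table_name,
       c.name AS column_name, c.max_length AS column_len
FROM sys.parameters p
JOIN sys.columns c
  ON c.name = STUFF(p.name, 1, 1, '')   -- strip the leading '@'
 AND c.system_type_id = p.system_type_id
WHERE p.max_length < c.max_length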
For those of you who would like to reference my exact issue, I'm dealing with the RSExecution SSIS package at the "Update Parameters" data flow task, at the Script Component.
The script tries to split parameter data into name and value. Unfortunately, I have several reports that are passing parameters that are very large. One example has over 65,000 characters, all in the normal "&paramname=value&parm2=value..." format.
The code in the script works fine until it gets to one of these very large parameter sets. I have figured out what is causing the issue. Here's some code:
Dim paramBlob As Byte()
paramBlob = Row.BlobColumn.GetBlobData(0, Row.BlobColumn.Length)
The second parameter of the .GetBlobData function takes an INTEGER as its count! Therefore, no matter what kind of datatype I pass to the string that the script will later split, it will be limited to 32767 characters.
THIS IS A PROBLEM!!!
Does anyone know a workaround for this issue? I need all of the parameter data to be reported, and I would hate to have to skip over rows like this. Also, if I'm missing something, please fill me in!
Hi Forum, I have a split function:

CREATE FUNCTION dbo.Split2 (@List nvarchar(2000), @SplitOn nvarchar(5))
RETURNS @RtnValue TABLE (Id int identity(1,1), Value nvarchar(100))
AS
BEGIN
    While (Charindex(@SplitOn, @List) > 0)
    Begin
        Insert Into @RtnValue (Value)
        Select Value = ltrim(rtrim(Substring(@List, 1, Charindex(@SplitOn, @List) - 1)))
        Set @List = Substring(@List, Charindex(@SplitOn, @List) + len(@SplitOn), len(@List))
    End
    Insert Into @RtnValue (Value)
    Select Value = ltrim(rtrim(@List))
    Return
END

(Courtesy p2p.wrox.com.) It returns a table. How do I take the return into a temporary table? I want it inside an SP, because I have a varchar field named occasionTime in a table; a sample value in it is "January,21,EveryWeek". In my SP I have to get "January" and compare it to something, or "21", or "EveryWeek" for comparison. How to do it? Regards, Naveen
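Since Split2 is a table-valued function, its rows can be selected directly or captured into a table variable inside the SP. A sketch using the sample value:

-- capture the function's output
DECLARE @parts TABLE (Id int, Value nvarchar(100))
INSERT INTO @parts (Id, Value)
SELECT Id, Value FROM dbo.Split2(N'January,21,EveryWeek', N',')

-- or pick a single piece straight from the function
SELECT Value FROM dbo.Split2(N'January,21,EveryWeek', N',') WHERE Id = 1  -- 'January'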
How do I pass the numeric(12,0) data type to a variable in SSIS? What kind of variable data type should I choose? I am trying to assign the object_key column (numeric(12,0)) to a variable in SSIS.
If I select Int32, it keeps giving me an error: Error: 0xC001F009 at Row by Row process: The type of the value being assigned to variable "User::Object_Key" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object.
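numeric(12,0) can hold values up to 999,999,999,999, which overflows Int32 (max 2,147,483,647), so Int64 is the natural variable type here. If the mapping still complains, casting in the source query sometimes sidesteps it (a sketch; the table name is a placeholder):

SELECT CAST(object_key AS bigint) AS object_key FROM dbo.MyTable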
Here's what I want to do: I've got a table with orders; each order has a specific discount rate (an int, which represents a percentage). Each order consists of 1 or more items in another table, and each item in that table has a price. Now I want to return the full price and the discounted price (or the discounted amount). Here's a relevant excerpt of the code:

CREATE TABLE #tmp (OrderID Integer, Price money, Discount money)

DECLARE @Discount money
SELECT @Discount =
    (((SELECT SUM(OrderDetails.Price * OrderDetailsAmount)
       FROM OrderDetails
       WHERE OrderID = @orderID AND CustomerID = @CustomerID)
      +
      (SELECT ISNULL(SUM(OrderDetailsSupplement.Price * OrderDetailsAmount), 0)
       FROM OrderDetailsSupplement
       INNER JOIN OrderDetails ON OrderDetailsSupplement.OrderDetailsID = OrderDetails.OrderDetailsID
       WHERE OrderID = @orderID AND CustomerID = @CustomerID))
     * (@DiscountRate / 100))

SELECT CustomerFull, SUM(Price) As Price, SUM(Discount) As Discount, SUM(Products) As Products, COUNT(@orderID) As Orders
FROM #tmp
GROUP BY CustomerFull
ORDER BY CustomerFull

The problem: instead of getting a low number (like 0.57 for instance), I get a 0. Right now I've "solved" this by replacing "(@DiscountRate / 100)" with just "@DiscountRate" and then dividing by 100 in my asp code, but I'd really like to know what I'm doing wrong.
--BVH
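The culprit looks like integer division: with @DiscountRate declared as an int, @DiscountRate / 100 truncates to 0 for any rate under 100. Forcing a non-integer divisor fixes it:

SELECT 57 / 100 AS int_division,       -- 0: both operands are integers
       57 / 100.0 AS decimal_division  -- 0.570000: decimal divisor forces decimal math

-- so in the original query, use: * (@DiscountRate / 100.0)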
I've got some tables with the year as part of the name, for example: TABLE2006, TABLE2007, etc. I will read the year of the table name from the INSERTED table of my trigger. I need to create a trigger where I update those tables:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TRIGGER [TESTE] ON [dbo].[TABTESTE]
FOR INSERT
AS
DECLARE @YearTable nvarchar(4),
        @IdClient INT,
        @MyTable TABLE (IdClient INT, Situ NVARCHAR(50))
BEGIN
    SET NOCOUNT ON;
    SELECT @YearTable = SITUACAO, @IdClient = IdClient FROM INSERTED
    SET @MyTable = 'TABLE' & @YearTable
    UPDATE @MyTable SET Situ = 'X' WHERE IdClient = @IdClient
END
GO

Errors:
Msg 156, Level 15, State 1, Procedure TESTE, Line 9
Incorrect syntax near the keyword 'TABLE'.
Msg 137, Level 15, State 1, Procedure TESTE, Line 17
Must declare the scalar variable "@MyTable".
Msg 1087, Level 15, State 2, Procedure TESTE, Line 18
Must declare the table variable "@MyTable".
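Both errors come from trying to use a variable as a table name; T-SQL doesn't substitute variables there (and & is VB's string concatenation operator; T-SQL uses +). The update has to be built as dynamic SQL, as in this sketch:

DECLARE @sql nvarchar(200)
SET @sql = N'UPDATE dbo.TABLE' + @YearTable + N' SET Situ = ''X'' WHERE IdClient = @id'
EXEC sp_executesql @sql, N'@id int', @id = @IdClient

Note also that INSERTED can contain more than one row, so the single-row SELECT ... FROM INSERTED above only handles one row of a multi-row insert.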
Is there any way to change the data type of a variable while in an expression? My problem is that I am trying to compare a variable with a String data type to a variable with an Object data type. I would change the data type of the variable from Object to String, but if I do that my SQL task fails when it tries to write a value to that variable. The variable with the Object data type is the result of an OPENQUERY SQL statement. So I guess there are two ways around my problem:
1. Change the data type of the variable while in an expression (i.e. a flow constraint), or
2. Change the data type of the variable from Object to String and still get the OPENQUERY result to work.