Get A List Of Output Columns On Script Transformation
Feb 13, 2007
I am using a script component to transform data. In the script component I created a bunch of fields for the output. Is there any way to loop through that list of columns? Is there code I can use in the script component to access the names, data types, data etc?
I saw a lot of information on the OutputColumnCollection as part of some IDTSOutput90 thing (Greek to me). As best I can guess this is for creating your own new columns, but can I see what columns are already defined via the script interface?
View 2 Replies
Jun 21, 2006
Can I add Output Columns to the Script Transformation Editor using code? I have to execute a SQL statement to determine the number of years we have data for an item, and then create columns for the months in those years and populate them with the quantities. So my question is: can I create output columns in the Script Transformation Editor on the fly, that is, as the package is being executed?
Any input will be good.
Thanks,
MShah
View 3 Replies
View Related
Aug 14, 2007
I would like my transformation to automatically create an output column for each input column. Any tips? I can't seem to determine which event to listen to or method to override.
View 3 Replies
View Related
Dec 25, 2006
While using an Aggregate Transformation to group one column, the number of output rows is sometimes larger than the number of rows returned by a T-SQL statement via SSMS.
For example, the output of the Aggregate Transformation may be 960216, but the
'Select Count(Orderid) From ... Group By ***' T-SQL statement returns 96018*.
I'm sure the Group By of the Aggregate Transformation is right!
But when I set the "KeyScale" property of the transformation, the results match!
In my opinion, the "KeyScale" property should just affect the performance of the transformation, not its result.
Thanks for your advice.
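One frequent cause worth checking (an assumption, not a certainty): the Aggregate transformation compares strings exactly, so values differing only in case or trailing spaces form separate groups, while a typical SQL Server collation treats them as equal. A rough T-SQL check, with YourGroupColumn and dbo.YourTable standing in for the real names:
SELECT COUNT(DISTINCT YourGroupColumn) AS groups_default_collation,
       COUNT(DISTINCT (YourGroupColumn + '|') COLLATE Latin1_General_BIN) AS groups_exact_match
FROM dbo.YourTable;
-- The '|' sentinel keeps trailing spaces from being ignored; if groups_exact_match
-- equals the Aggregate output, case or trailing spaces explain the difference.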
View 2 Replies
View Related
Dec 8, 2006
I am working on an OLAP modeled database.
I have a Lookup Transformation that matches the natural key of a dimension member and returns the dimension key for that member (surrogate key pipeline stuff).
I am using an OLE DB Command as the Error flow of the Lookup Transformation to insert an "Inferred Member" (new row) into a dimension table if the Lookup fails.
The OLE DB Command calls a stored procedure (dbo.InsertNewDimensionMember) that inserts the new member and returns the key of the new member (using scope_identity) as an output.
What is the syntax in the SQL Command line of the OLE DB Command Transformation to set the output of the stored procedure as an Output Column?
I know that I can 1) add a second Lookup with "Enable memory restriction" on (no caching) in the Success data flow after the OLE DB Command, 2) find the newly inserted member, and 3) Union both Lookup results together, but this is a large dimension table (several million rows) and searching for the newly inserted dimension member seems excessive, especially since I have the ID I want returned as output from the stored procedure that inserted it.
Thanks in advance for any assistance you can provide.
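A rough sketch of one way to wire this up, assuming the procedure exposes the key as an OUTPUT parameter (the parameter names and dbo.DimMember below are assumptions; dbo.InsertNewDimensionMember is from the post). Each ? marker is bound to a pipeline column on the Column Mappings tab of the OLE DB Command, so the returned key needs a placeholder column in the pipeline (for example one added earlier with a Derived Column):
-- Stored procedure shape (sketch):
CREATE PROCEDURE dbo.InsertNewDimensionMember
    @NaturalKey   varchar(50),
    @DimensionKey int OUTPUT
AS
BEGIN
    INSERT INTO dbo.DimMember (NaturalKey) VALUES (@NaturalKey);
    SET @DimensionKey = SCOPE_IDENTITY();
END;
GO
-- SQL command text in the OLE DB Command transformation:
EXEC dbo.InsertNewDimensionMember ?, ? OUTPUT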
View 9 Replies
View Related
Jul 3, 2006
Hi,
I am writing a data flow task which will take a particular column from the source table, and I am passing the column value in the SQL command property. My SQL command looks like this:
Select SerialNumber From SerialNumbers Where OrderID = @OrderID
If I go and check the output columns in the Input and Output Properties tab, I am not able to see this SerialNumber column in the output column tree, so I cannot access this column in the next transformation component.
Please help me.
Thanks in advance.
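If that SQL command lives in an OLE DB Command transformation, note that its result set is not added to the pipeline (only parameters are mapped). One possible alternative, sketched here with an assumed dbo.Orders table for the OrderID side, is to resolve the serial numbers with a join in the source (or in a Lookup whose reference is this query):
SELECT o.OrderID, s.SerialNumber
FROM   dbo.Orders        AS o
JOIN   dbo.SerialNumbers AS s
       ON s.OrderID = o.OrderID;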
View 13 Replies
View Related
Jun 16, 2006
I am developing a custom destination component and I have encountered a few areas where there seems to be a lack of helpful documentation and examples.
1. I have not been able to find any information on or examples of creating custom destinations with an error output. The OLE DB Destination has an error output so I investigated the input and error output properties in the advanced editor and found that the OLE DB Destination error output is synchronous with the input (its SynchronousInputID matches the input's ID) and has its ExclusionGroup value set to 1. Using this information, I modeled my error output after the OLE DB Destination.
ProvideComponentProperties:
AddErrorOutput(ERROR_OUTPUT_NAME, input.ID, 1);
ProcessInput:
int errorOutputID = -1;
int errorOutputIndex = -1;
GetErrorOutputInfo(ref errorOutputID, ref errorOutputIndex);
...
buffer.DirectErrorRow(errorOutputID, 0, errorOutputIndex);
Checking the input and error output properties in the advanced editor for my custom destination component I find the following:
Input
-----
ID: 3515
Error Output
------------
ExclusionGroup: 1
ID: 3516
IsErrorOut: True
SynchronousInputID: 3515
Shortly after I start my SSIS package and it encounters an error row, I get the following exception:
[My Destination Adapter 1 [3512]] Error: System.ArgumentException: Value does not fall within the expected range. at Microsoft.SqlServer.Dts.Pipeline.Wrapper.IDTSBuffer90.DirectErrorRow(Int32 hRow, Int32 lOutputID, Int32 lErrorCode, Int32 lErrorColumn) at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.DirectErrorRow(Int32 outputID, Int32 errorCode, Int32 errorColumn) at MyDestination.ProcessInput(Int32 inputID, PipelineBuffer buffer) at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
2. My custom destination component is used for writing a file with a fixed schema. I followed the means by which source component examples add their output columns, but applied this to my external metadata columns. In my Validate() I check if the ExternalMetadataColumnCollection.Count == 0 and return DTSValidationStatus.VS_NEEDSNEWMETADATA; to force a call to ReinitializeMetaData(). In ReinitializeMetaData() I call a method that creates the input's external metadata columns that reflect my external data source.
This works fine except that every time I add my custom destination component to an SSIS package and go to edit the component, I am greeted with a dialog box that states: "The component is not in a valid state. ... Do you want the component to fix these errors automatically?" Pressing the Yes button, I assume, makes the call to ReinitializeMetaData() and I have my external metadata columns. Where is the correct place to add the external metadata columns so the user does not have to take this extra step every time they add my component to their package?
View 5 Replies
View Related
Feb 13, 2008
Hi,
I have two tables,
Table A on Server 1 (3 ROWS)
ID Name Address
ID1 A B
ID2 X Y
ID3 M N
There is another table on a different server which looks like
Table B on Server 2
PKColumn ID Details
1 ID1 Desc1
2 ID1 Desc2
3 ID1 Desc 3
4 ID2 Desc
5 ID2 Description
As you can see the ID is the common column for these two tables,
I want to query the above two tables, and the output should be dumped into a new table on Server 2.
I am using the following SSIS Package
OledbDataSource-------> OledbCommand(Select * from TableB where ID =?)
From here, how can I insert the rows returned from the OLE DB Command into another table?
Since, for each row of TableA, it will return some output rows... how can I insert all of these into the new table?
Please help on configuring the output of the OLE DB Command.
Thanks,
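Since the OLE DB Command is not designed to feed its result set back into the pipeline, one possible approach (a sketch; SERVER1, SourceDb, and dbo.NewTable are assumed names) is to stage or link Table A on Server 2 and let a set-based join produce the rows to load:
INSERT INTO dbo.NewTable (PKColumn, ID, Details)
SELECT b.PKColumn, b.ID, b.Details
FROM   dbo.TableB AS b
INNER JOIN SERVER1.SourceDb.dbo.TableA AS a
        ON a.ID = b.ID;
In pure SSIS terms, an OLE DB source for Table B joined to Table A via a Merge Join or Lookup would accomplish the same thing.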
View 5 Replies
View Related
Feb 17, 2008
Is it true that I will not be able to use the returned value from an sp that is called on every row from an OLE DB Command transformation? I see all kinds of complaints on the web but can't determine if this would be a waste of time. I'd like to append the returned value (which is calculated and cannot be joined in the buffer) to the data on its way out of the transformation.
View 3 Replies
View Related
Aug 4, 2009
I am using SSIS in SQL Server 2005 Enterprise Edition. I have two OLE DB data sources from two disparate databases (IBM DB2 and Microsoft SQL Server), some columns from each of which are to be included in the merged output results. I have noted the various requirements in the forum postings with regard to sorting the OLE DB sources and specifying the output source columns as being sorted, as well as the requirement that the join fields in the two sources be close/exact matches. Yet, when I run this in VS, while the work area reflects the expected number of rows being input into the Merge Join transformation, no count is reflected as output from that transformation into the final destination table. Specifically, my two data sources (IBM DB2 and MS SQL) are configured as follows:
IBM DB2 contains an SQL statement that uses Cast operations to create the result columns, and an ORDER BY clause to ensure that the output is sorted by the desired two columns. The OLE DB source property IsSorted is set to true; the Output Columns folder column definitions for "key_source_dtsy" and "key_source_dtrt" have their SortKeyPosition properties set to 1 and 2, respectively. Those fields are both defined as data type DT_STR, with lengths of 4 and 2, respectively. Below is the Path metadata from the Data Flow Path editor for the path from this source:
IBM DB2 source"Name" "Data Type" "Precision" "Scale" "Length" "Code Page" "Sort Key Position" "Comparison Flags" "Source
Component""ID_CODE" "DT_STR" "0" "0" "10" "1252" "0" "" "Source F0005 User Defined Codes""CODE_DESCR_1" "DT_STR" "0" "0" "30" "1252" "0" "" "Source F0005 User Defined Codes""CODE_DESCR_2" "DT_STR" "0" "0" "30" "1252" "0" "" "Source F0005 User Defined Codes""key_source_dtsy" "DT_STR" "0" "0" "4" "1252" "1" "" "Source F0005 User Defined Codes""key_source_dtrt" "DT_STR" "0" "0" "2" "1252" "2" "" "Source F0005
User Defined Codes:
MS SQL contains an SQL statement that takes the columns as they are in the MS SQL table (no Cast operations needed); it also uses an ORDER BY clause to ensure the output is sorted by the join columns. The OLE DB source property IsSorted is set to true; the Output Columns folder columns for "key_source_dtsy" and "key_source_dtrt" have their SortKeyPosition properties set to 1 and 2, respectively. Those fields are both defined as data type DT_STR, with lengths of 4 and 2, respectively. Below is the Path metadata from the Data Flow Path editor for the path from this source:
MS SQL source"Name" "Data Type" "Precision" "Scale" "Length" "Code Page" "Sort Key Position" "Comparison Flags" "Source Component""id_code_name" "DT_I2" "0" "0" "0" "0" "0" "" "Source CodeName in db dwVdFY""key_source_dtsy" "DT_STR" "0" "0" "4" "1252" "1" "" "Source CodeName in db dwVdFY""key_source_dtrt" "DT_STR" "0" "0" "2" "1252" "2" "" "Source CodeName in db dwVdFY"
The Merge Join transformation specifies an INNER JOIN using the columns named "key_source_dtsy" and "key_source_dtrt" from the respective data sources. I know there are alternative ways of accomplishing my intent (Lookup, port the MS SQL table to IBM DB2 so the join can occur in the SELECT statement, etc.); however, I'd like to use this functionality and assume that it should work.
View 13 Replies
View Related
Aug 30, 2006
I've built a simple custom data flow transformation component following the Hands On Lab (http://www.microsoft.com/downloads/details.aspx?familyid=1C2A7DD2-3EC3-4641-9407-A5A337BEA7D3&displaylang=en) and the Books Online (ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/adc70cc5-f79c-4bb6-8387-f0f2cdfaad11.htm and ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.SQL.v2005.en/dtsref9/html/b694d21f-9919-402d-9192-666c6449b0b7.htm).
All it is supposed to do is create an output column and set its value to the result of calling a web service method (the transformation is synchronous). Everything seems fine, but when I run the data flow task that contains it, it doesn't generate any output. The Visual Studio debugger displays it as yellow, with 1,385 rows going into it, but the data viewer attached to its output is empty. The output metadata looks just like I expect: all of my input columns plus the new column, correctly typed. No validation or run-time warnings or errors are reported.
I'll include the entire C# file below, which only overrides the ProvideComponentProperties, Validate, PreExecute, ProcessInput, and PostExecute methods of the parent PipelineComponent class.
Since this is effectively a specialization of the DerivedColumn transformation, could I inherit from the class that implements the DC component instead of PipelineComponent? How do I even find out what that class is?
Thanks! Here's the code:
using System;
// using System.Collections.Generic;
// using System.Text;
using Microsoft.SqlServer.Dts.Pipeline;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
namespace CustomComponents
{
[DtsPipelineComponent(DisplayName = "GID", ComponentType = ComponentType.Transform)]
public class GidComponent : PipelineComponent
{
///
/// Column indexes for faster processing.
///
private int[] inputColumnBufferIndex;
private int outputColumnBufferIndex;
///
/// The GID web service.
///
private GID.WS_PDF.PDFProcessService gidService = null;
///
/// Called to initialize/reset the component.
///
public override void ProvideComponentProperties()
{
base.ProvideComponentProperties();
// Remove any existing metadata:
base.RemoveAllInputsOutputsAndCustomProperties();
// Create the input and the output:
IDTSInput90 input = this.ComponentMetaData.InputCollection.New();
input.Name = "Input";
IDTSOutput90 output = this.ComponentMetaData.OutputCollection.New();
output.Name = "Output";
// The output is synchronous with the input:
output.SynchronousInputID = input.ID;
// Create the GID output column (16-character Unicode string):
IDTSOutputColumn90 outputColumn = output.OutputColumnCollection.New();
outputColumn.Name = "GID";
outputColumn.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_WSTR, 16, 0, 0, 0);
}
///
/// Only 1 input and 1 output with 1 column is supported.
///
///
public override DTSValidationStatus Validate()
{
bool cancel = false;
DTSValidationStatus status = base.Validate();
if (status == DTSValidationStatus.VS_ISVALID)
{
// The input and output are created above and should be exactly as specified
// (unless someone manually edited the persisted XML):
if (ComponentMetaData.InputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component accepts 1 Input.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component provides 1 Output.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
else if (ComponentMetaData.OutputCollection[0].OutputColumnCollection.Count != 1)
{
this.ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output must be 1 column.",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISCORRUPT;
}
// And the output column should be a Unicode string:
else if ((ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].DataType != DataType.DT_WSTR) ||
(ComponentMetaData.OutputCollection[0].OutputColumnCollection[0].Length != 16))
{
ComponentMetaData.FireError(0, ComponentMetaData.Name,
"Invalid metadata: component Output column data type must be (DT_WSTR, 16).",
string.Empty, 0, out cancel);
status = DTSValidationStatus.VS_ISBROKEN;
}
}
return status;
}
///
/// Called before executing, to cache the buffer column indexes.
///
public override void PreExecute()
{
base.PreExecute();
// Get the index of each input column in the buffer:
IDTSInput90 input = ComponentMetaData.InputCollection[0];
inputColumnBufferIndex = new int[input.InputColumnCollection.Count];
for (int col = 0; col < input.InputColumnCollection.Count; col++)
{
inputColumnBufferIndex[col] = BufferManager.FindColumnByLineageID(input.Buffer, input.InputColumnCollection[col].LineageID);
}
// Get the index of the output column in the buffer:
IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
outputColumnBufferIndex = BufferManager.FindColumnByLineageID(input.Buffer, output.OutputColumnCollection[0].LineageID);
// Get the GID web service:
gidService = new GID.WS_PDF.PDFProcessService();
}
///
/// Called to process the buffer:
/// Get a new GID and save it in the output column.
///
///
///
public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
if (! buffer.EndOfRowset)
{
try
{
while (buffer.NextRow())
{
// Set the output column value to a new GID:
buffer.SetString(outputColumnBufferIndex, gidService.getGID());
}
}
catch (System.Exception ex)
{
bool cancel = false;
ComponentMetaData.FireError(0, ComponentMetaData.Name, ex.Message, string.Empty, 0, out cancel);
throw new Exception("Could not process input buffer.");
}
}
}
///
/// Called after executing, to clean up.
///
public override void PostExecute()
{
base.PostExecute();
// Resign from the GID service:
gidService = null;
}
}
}
View 1 Replies
View Related
Mar 17, 2008
HI,
How do I loop through all columns in a row using a script transformation? For example, if I want to trim all columns. If I want to trim one column, this is a simple script:
Public Class ScriptMain
Inherits UserComponent
Public Overrides Sub MyAddressInput_ProcessInputRow(ByVal Row As MyAddressInputBuffer)
Row.City = Trim(Row.City)
End Sub
End Class
But what if I want to do that for all columns? I don't want to name them all like this:
Public Class ScriptMain
Inherits UserComponent
Public Overrides Sub MyAddressInput_ProcessInputRow(ByVal Row As MyAddressInputBuffer)
Row.Column1 = Trim(Row.Column1)
Row.Column2 = Trim(Row.Column2)
Row.Column3 = Trim(Row.Column3)
...
...
Row.Column997 = Trim(Row.Column997)
Row.Column998 = Trim(Row.Column998)
Row.Column999 = Trim(Row.Column999)
End Sub
End Class
Is there a simple foreach column in Row.columns option?
-- Joost (Atos Origin)
View 11 Replies
View Related
Jan 23, 2008
Is there by chance a cunning way to make the input columns automatically populate the output of an asynchronous script transformation?
My transformation writes several rows for each input row read. I'm creating some new columns along the way but I'd like all of the input columns to get output each time also. However I can't see any obvious way to achieve this, short of manually defining each column to the output and populating it in the script.
View 3 Replies
View Related
Jan 9, 2008
How would you do the following in SSIS?
SELECT a.TestID,
a.TestCode
FROM TableA a
WHERE UPPER(RTRIM(a.TestCode)) IN (SELECT UPPER(RTRIM(b.TestCode)) FROM TableB b)
Of course the above query is missing a few things, but with ETL the UPPER(RTRIM(...)) in the where clause does not appear to be something that has an object or property that I can use in the Lookup.
Please correct and educate me.
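The Lookup's reference side can be the result of a SQL query, so the normalization can be pushed there; the input side can be normalized first with a Derived Column (e.g. UPPER(TRIM(TestCode))) before the Lookup joins on it. A sketch of the reference query:
SELECT DISTINCT UPPER(RTRIM(b.TestCode)) AS TestCode
FROM   TableB AS b;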
View 4 Replies
View Related
Mar 4, 2008
Hi,
I have an example situation that seems like it should have a super easy solution, but my jobs keep failing.
Here we go. . .
I have a SQL Server 2005 table as my source in a data flow task.
This table contains raw data.
We'll call it FACT_Product_Raw - which contains a field called ProductType varchar(1)
Let's say that ProductType contains values of "A" or "B" or "C" - or for that matter, some null and garbage values
I have a lookup table, LOV_Product_Types
This table contains 3 fields that will transform my raw data table
We'll call these fields ProdTypeID smallint, ProdTypeRaw varchar(1) and ProdType smallint
It contains pairs such that A = 1, B = 2, and so on.
Here's what I want to do.
I want to ADD a field to FACT_Product_Raw that contains the "looked up" value from LOV_Product_Types.
Let's say that I want to add the ProdTypeID field to my _Raw table.
I have used the _Raw table as both my source and destination
It blows up every time.
Help.
Thanks,
David
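Because the source and destination are the same table, one set-based alternative (a sketch; it assumes ProdTypeID has already been added to FACT_Product_Raw, e.g. with ALTER TABLE) is to skip the data flow for this step and update in place from an Execute SQL task:
UPDATE f
SET    f.ProdTypeID = l.ProdTypeID
FROM   dbo.FACT_Product_Raw AS f
LEFT JOIN dbo.LOV_Product_Types AS l
       ON l.ProdTypeRaw = f.ProductType;
-- Rows with null or garbage ProductType values simply end up with a NULL ProdTypeID.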
View 5 Replies
View Related
Sep 4, 2006
The documentation on the fuzzy lookup transform mentions that only columns of type DT_WSTR and DT_STR can be used in fuzzy matching. I interpreted this as meaning that you could not create a mapping between an input column of type DT_NTEXT and a column from the reference table. I assumed that you could still have a DT_NTEXT column as part of the input and mark this as a pass-through column so that its value could be inserted in the destination, together with the result of the lookup operation. Apparently this is not the case. Validation fails with the following message: 'The data type of column 'fieldname' is not supported.' First, I'd like to confirm that this is really the case and that I have not misinterpreted this limitation.
Finally, given the following situation
- A data source with input columns
Field_A DT_STR
Field_B DT_NTEXT
- A fuzzy lookup is used to match Field_A to a row in the reference table and obtain Field_C.
- Finally, Field_B and Field_C must be inserted into the destination.
Can anyone suggest how this could be achieved?
Fernando Tubio
View 5 Replies
View Related
Dec 2, 2007
I'll try to reproduce this later, but want to report it before I forget.
I just had my package fail on a VM I was testing on. It failed because on that machine, I logged in as MachineName\Administrator instead of using my domain account (the VM is not in the domain).
This was a problem because the "User Name" column generated by the Audit Transformation was 17 characters long! This is the length of my domain + user name on my development machine. Similarly, the machine name length was 15 characters.
I'd love to know what the "correct" sizes are for these columns. In the meantime, I'm going to set these to 255 manually, and hope the size sticks.
P.S. There was one other post on this topic, though the thread isn't clear that this was the problem: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=472445&SiteID=1.
View 1 Replies
View Related
Apr 14, 2015
In SSIS I use the DQS Cleansing transformation component. I've got a knowledge base (KB) in place, and this KB holds various domains. My data source has more input columns than I would like to use for a particular clean-up operation, and I want to use some of those input columns to map against some domains in the KB. It is my understanding that it should be possible to select only the required input columns, but all I can do is select all input columns.
View 3 Replies
View Related
May 23, 2006
Can someone please explain the difference between Output and External columns? I can't fathom why "Output" columns aren't good enough. In other words, why is there a need or value in having two types of "output" columns?
TIA,
Barkingdong
View 5 Replies
View Related
Jul 30, 2007
I'm creating a new Integration Services project that copies data out of a SQL Server 7 server, transforms it, and places the data on a SQL Server 2005 (SP2) server. When defining a lookup transformation, if I specify an OLE DB connection to my server running SQL Server 7 as the reference table, as soon as I click on the Columns tab, Visual Studio closes / crashes and dumps me to Windows. I don't get an error message. If, however, I specify a connection to a server running SQL Server 2000 or SQL Server 2005, there are no problems.
Is this supposed to happen?
My workstation is running Windows XP Pro SP2, Visual Studio 2005 Pro.
Microsoft SQL Server Integration Services Designer
Version 9.00.1399.00
The server that doesn't work for a reference table is running Windows 2000 Server SP4
SQL 7.00.623
Thanks for your help,
Kirk
View 6 Replies
View Related
May 6, 2015
Looking at an execution plan today I noticed something I've not seen before. The plan includes a non clustered index seek, followed by a RID lookup on the heap. The output list for the index seek contains the expected "Bmk" column (in this case "Bmk1473"), but also includes "IsBaseRow1475". This isn't a column from the table.
View 3 Replies
View Related
Apr 18, 2007
Dear Folks,
Can you please tell me the query to display the columns in a table?
thank you very much
Vinod
View 3 Replies
View Related
Feb 5, 2014
How do I get a value from a column using a list of tables?
For example, I have list:
schema_name, table_name, column_name
How do I get:
schema_name, table_name, column_name, column_value
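A rough dynamic-SQL sketch, assuming the list is stored in a table called dbo.ColumnList and that every listed column can be cast to nvarchar so a single column_value column works:
DECLARE @sql nvarchar(max) = N'';
SELECT @sql = @sql
    + N' UNION ALL SELECT '
    + QUOTENAME(schema_name, '''') + N' AS schema_name, '
    + QUOTENAME(table_name,  '''') + N' AS table_name, '
    + QUOTENAME(column_name, '''') + N' AS column_name, '
    + N'CAST(' + QUOTENAME(column_name) + N' AS nvarchar(4000)) AS column_value FROM '
    + QUOTENAME(schema_name) + N'.' + QUOTENAME(table_name)
FROM dbo.ColumnList;
SET @sql = STUFF(@sql, 1, 11, N'');  -- strip the leading ' UNION ALL '
EXEC sys.sp_executesql @sql;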
View 7 Replies
View Related
Mar 4, 2008
Hi All.
I'm an Oracle DBA who's currently being asked to look at a SQL Server database. I need a list of columns per table, but am having trouble.
I'll admit I might be being lazy here, but I'm in a hurry and using the valuable resources available to me!! I would really appreciate the SQL I need to copy into the query window. Much obliged!!
I need........
Table A
Column1 Datatype
Column2 Datatype
Table B
Column1 Datatype
Column2 Datatype
etc....
Many thanks.
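A sketch using the ANSI information schema views, which gives one row per column grouped by table when run in the database in question:
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM   INFORMATION_SCHEMA.COLUMNS
ORDER BY TABLE_SCHEMA, TABLE_NAME, ORDINAL_POSITION;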
View 7 Replies
View Related
May 12, 2005
I am trying to find a way to query the following data from a table that looks like this:
ColA ColB ColC
1 2 3
Output should look like this:
ColA
1
2
3
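One way to get that shape (a sketch; dbo.MyTable is an assumed name) is a UNION ALL over the three columns; on SQL Server 2005 and later, UNPIVOT does the same job:
SELECT ColA FROM dbo.MyTable
UNION ALL
SELECT ColB FROM dbo.MyTable
UNION ALL
SELECT ColC FROM dbo.MyTable;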
View 2 Replies
View Related
Nov 26, 2014
The following returns all "varchar" columns in base tables within the database:
Code:
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME FROM mydb.information_schema.columns
WHERE TABLE_SCHEMA = 'master' AND TABLE_CATALOG = 'mydb'
AND DATA_TYPE IN('varchar')
AND TABLE_NAME IN(
SELECT TABLE_NAME FROM mydb.information_schema.tables
WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_CATALOG = 'mydb' AND TABLE_SCHEMA = 'master')
What I then want to do is... For each of these results:
Code:
select [COLUMN_NAME] from [TABLE_SCHEMA].[TABLE_NAME]
WHERE ID = 'test'
Is it possible to do this in one SQL command? Or do I manually have to do it for each in the list from my first query?
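It can be folded into one batch with dynamic SQL (a rough sketch that assumes every matching table really has an ID column; each table produces its own result set):
DECLARE @sql nvarchar(max) = N'';
SELECT @sql = @sql
    + N'SELECT ' + QUOTENAME(c.COLUMN_NAME) + N' FROM '
    + QUOTENAME(c.TABLE_SCHEMA) + N'.' + QUOTENAME(c.TABLE_NAME)
    + N' WHERE ID = ''test'';' + CHAR(10)
FROM mydb.INFORMATION_SCHEMA.COLUMNS AS c
JOIN mydb.INFORMATION_SCHEMA.TABLES  AS t
  ON  t.TABLE_SCHEMA = c.TABLE_SCHEMA
  AND t.TABLE_NAME   = c.TABLE_NAME
WHERE t.TABLE_TYPE   = 'BASE TABLE'
  AND c.DATA_TYPE    = 'varchar'
  AND c.TABLE_SCHEMA = 'master';
EXEC sys.sp_executesql @sql;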
View 3 Replies
View Related
Mar 22, 2014
I get the following error when I execute my CTE on SQL Server 2008 R2: "CTE1 has more columns than were specified in the column list".
--SQL CODE IS AS BELOW
WITH CTE1 (AUCTIONID,[Open Time], [Response SLA Broken], [Response SLA Deadline], [Response SLA Actual], [Responsetime SLA Broken Before Transfer], [Resolution SLA Broken], [Resolution SLADeadline], [Resolutiontime SLA Broken
[code]...
View 2 Replies
View Related
Jul 23, 2005
Hey guys, couldn't find this anywhere in Google. I want a list of all database column names for a specific table/view from across databases. I tried this:
Select * From Information_Schema.Columns
I also tried this:
select syscolumns.name, sysobjects.name, * from syscolumns, sysobjects where sysobjects.id = syscolumns.id and (sysobjects.xtype='U' or sysobjects.xtype='S')
These queries return information about the CURRENT database. But if I want to do it ACROSS databases or across servers, how can I do this? I will express my gratitude to everyone who is kind enough to answer this question. (I've been stuck with this problem for a while now.) Thanks! OhMyGaw!
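For another database on the same server, three-part names against the information schema views work; for another server, a linked server and a four-part name can be used (a sketch; OtherDb, LINKEDSRV, and MyTable are assumed names):
SELECT TABLE_CATALOG, TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME
FROM   OtherDb.INFORMATION_SCHEMA.COLUMNS
WHERE  TABLE_NAME = 'MyTable';
-- across servers, via a linked server:
SELECT TABLE_CATALOG, TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME
FROM   LINKEDSRV.OtherDb.INFORMATION_SCHEMA.COLUMNS
WHERE  TABLE_NAME = 'MyTable';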
View 6 Replies
View Related
Apr 2, 2008
Hi, I know sys.tables and sys.columns give me a list of tables and columns in a SQL 2005 database. How can I list the columns in a specific table, please? Thanks in advance, Ronny
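Two quick sketches for SQL Server 2005 (dbo.MyTable is an assumed name):
SELECT c.name, TYPE_NAME(c.user_type_id) AS data_type, c.max_length
FROM   sys.columns AS c
WHERE  c.object_id = OBJECT_ID('dbo.MyTable')
ORDER BY c.column_id;
-- or simply:
EXEC sp_help 'dbo.MyTable';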
View 3 Replies
View Related
Jul 20, 2005
According to BOL, columns in an ORDER BY clause do not have to be in the SELECT column list unless the SELECT includes DISTINCT, or the UNION operator. Is this a SQL Server thing, or SQL standard behavior? That is, if I were to write absolutely pure SQL-92, must columns in the ORDER BY clause be present in the SELECT list?
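In SQL Server the restriction itself is easy to demonstrate (a sketch; dbo.People is an assumed table):
SELECT FirstName FROM dbo.People ORDER BY LastName;           -- allowed
SELECT DISTINCT FirstName FROM dbo.People ORDER BY LastName;  -- error: ORDER BY items must appear in the select list if SELECT DISTINCT is specified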
View 1 Replies
View Related
Jul 20, 2005
Is there a way using MS SQL Server and Enterprise Manager to get a text document (or perhaps even a Word document) listing all table names, column names, etc. of a database?
-- Sugapablo
http://www.sugapablo.com <-- music
http://www.sugapablo.net <-- personal
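One low-tech sketch: run a query like the one below with results sent to text or to a file, and paste the output into the document:
SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM   INFORMATION_SCHEMA.COLUMNS
ORDER BY TABLE_NAME, ORDINAL_POSITION;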
View 2 Replies
View Related
Feb 4, 2008
Hi,
I would like to get a list of all the unused columns. I have tons of files and fields in the package, probably in the hundreds, and I need to know which fields I am not using so I can see which fields are important not to miss. Is there any way I can get them in a text file or any other format?
Thanks
View 5 Replies
View Related
Apr 2, 2002
I've got a question that I can't seem to find an answer for; I was hoping someone here might be able to point me in the right direction. I've set up a stored procedure that will email someone if any entries are added to a table. However, the output is garbled looking (see below):
Client Number SSN Client Name Old SD
New SD
------------- ----------- ---------------------------------------- --------
--------
800901 899-34-3482 John Smith 04/20/20
05/01/20
400909 144-23-0029 John Smith 04/09/20
04/11/20
447788 445-89-9967 kjl;j;j 04/05/20
04/12/20
300099 234-90-7815 John Johnson 04/08/20
04/15/20
What's happened is that the client name field is too wide, so the New SD field kicks down to the next line. I'd like to clean this up. Is there a way I can either increase the length of the row before it moves to the next line, or can I re-size the client name field to match the size of the data? In other words, cli_name_vc is declared as a varchar(40). If the longest name that comes up in the query is 18 characters long, can I re-size the output so that it does not take up 40 characters?
Any help is greatly appreciated.
thanks,
-scott
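A sketch of the usual fix, assuming the mail body comes straight from a SELECT: the text output pads each column to its declared width rather than to the longest value returned, so casting the name down to a fixed narrower width keeps the row on one line (the column and table names other than cli_name_vc are assumptions):
SELECT client_number                    AS [Client Number],
       ssn                              AS [SSN],
       CAST(cli_name_vc AS varchar(20)) AS [Client Name],
       old_sd                           AS [Old SD],
       new_sd                           AS [New SD]
FROM   dbo.NewClientEntries;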
View 1 Replies
View Related