Help With Denormalized Data
Dec 13, 2006
My ERP system (SAP Business One) does not have a "payment type" field; instead, it has a separate field for the amount of each type of payment. It makes for annoying reporting.
If I have the following data in the Incoming Payments table (ORCT)
DocEntry  CashSum  CreditSum  CheckSum  TrsfrSum  DocTotal
0         100      0          0         0         100
1         0        100        0         0         100
2         0        0          100       0         100
3         0        0          0         100       100
What is an efficient SQL query to return something like this? (It is okay if the type returns an integer code instead of a string description):
DocEntry  Type      Total
0         Cash      100
1         Credit    100
2         Check     100
3         Transfer  100
I am currently using:
SELECT  T0.DocEntry,
        CASE
            WHEN T0.CashSum   <> 0 THEN 'Cash'
            WHEN T0.CreditSum <> 0 THEN 'Credit'
            WHEN T0.CheckSum  <> 0 THEN 'Check'
            WHEN T0.TrsfrSum  <> 0 THEN 'Transfer'
        END AS [Type],
        CASE
            WHEN T0.CashSum   <> 0 THEN T0.CashSum
            WHEN T0.CreditSum <> 0 THEN T0.CreditSum
            WHEN T0.CheckSum  <> 0 THEN T0.CheckSum
            WHEN T0.TrsfrSum  <> 0 THEN T0.TrsfrSum
        END AS [Total]
FROM    ORCT T0
I think I've just been staring at this too long and wanted some fresh eyes to think outside of the box on this one.
I am using this to report on our customers' monthly statements so that I can display "12/13/06 Wire Transfer $100". There are many more complications to this query, such as currencies and which document number or reference to display, but I have figured out most of those on my own.
Thank you,
Mike
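One set-based alternative to the stacked CASE expressions is a UNION ALL over the four amount columns, keeping only the non-zero ones (a sketch, not from the thread; it uses the ORCT column names shown above and works on SQL 2000/2005):

SELECT DocEntry, 'Cash' AS [Type], CashSum AS [Total] FROM ORCT WHERE CashSum <> 0
UNION ALL
SELECT DocEntry, 'Credit', CreditSum FROM ORCT WHERE CreditSum <> 0
UNION ALL
SELECT DocEntry, 'Check', CheckSum FROM ORCT WHERE CheckSum <> 0
UNION ALL
SELECT DocEntry, 'Transfer', TrsfrSum FROM ORCT WHERE TrsfrSum <> 0

A side effect of this shape is that a payment split across two types (say cash plus check) comes back as two rows, one per non-zero amount, instead of only the first match in the CASE.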
View 2 Replies
Mar 30, 2008
I have just inherited a new project consisting of data imported into SQL 2005 from a multi-dimensional database. After finding the correct ODBC driver and importing the data I believed that I was done, but after reviewing the resulting structure I discovered why this was called a "multi-dimensional" database. The imported data is completely de-normalized and resembles an Excel spreadsheet more than a relational database. An example of this structure is the persons table. The table has multiple columns, some of which are the multi-dimensional fields. These fields contain multiple values separated with a tilde ("~"). I need to parse out the data and normalize it. In the specific sample of data below I am attempting to take the personid, associates, and assocattrib columns and insert them into a SQL table, associates. This table references the persons table: both the personid and the associates values reference the personid in the persons table.
Code Snippet
CREATE TABLE [dbo].[persons](
[namepkey] [int] PRIMARY KEY NOT NULL,
[nameid] [varchar](6) NOT NULL,
[lastname] [varchar](41) NULL,
[firstname] [varchar](50) NULL,
[mname] [varchar](50) NULL,
[sex] [char](1) NULL,
[race] [varchar](55) NULL,
[dob] [varchar](10) NULL,
[address] [varchar](28) NULL,
[city] [varchar](32) NULL,
[state] [varchar](25) NULL,
[zip] [varchar](127) NULL,
[hphone] [varchar](10) NULL,
[busphone] [varchar](50) NULL,
[profession] [varchar](28) NULL,
[employer] [varchar](42) NULL,
[eyecolor] [varchar](23) NULL,
[build] [varchar](14) NULL,
[complexion] [varchar](26) NULL,
[haircolor] [varchar](26) NULL,
[dlnumber] [varchar](36) NULL,
[dlstate] [varchar](27) NULL,
[jacketnumber] [varchar](130) NULL,
[height] [varchar](4) NULL,
[weight] [varchar](50) NULL,
[ethnicity] [varchar](25) NULL
)
CREATE TABLE [dbo].[associates](
[associd] [int] NOT NULL REFERENCES persons(namepkey),
[namepkey] [int] NOT NULL REFERENCES persons(namepkey),
[assocattribute] [varchar](20) NULL
)
The purpose of normalizing this data will be to show the relationship(s) between people as documented in the previous data structure, i.e. person 1 is an associate of person 336 and the attribute is WIT.
My problem lies in attempting to parse out the associates and assocattrib columns and relate them to the appropriate personid. The personid relates to each associate and assocattrib value, and the tilde (~) separates the values by ordinal position; in SQL, each position should become a separate row. For example the single row:
personid associates assocattrib
58201 252427~252427~252427 VICT/SUSP~WIT~RP
Should be:
58201 252427 VICT/SUSP
58201 252427 WIT
58201 252427 RP
The imported data can have no associates:
personid associates assocattrib
152683 NULL NULL
or up to 69 associates, I am not even going to try to paste all of that here.
There are over 400,000 rows that I am looking at parsing, so any way of doing this in t-sql or the clr would be GREAT. This data is stored in SQL 2005 standard SP2. The specific machine is our test/reporting server, so I am not necessarily concerned with the best performing solution, I am leaning more towards providing some free time for me.
Any help or suggestions, including better ideas on the table structure, would be greatly appreciated.
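For what it's worth, one set-based way to split tilde-delimited values on SQL Server 2005 is to join against a numbers table and pair the elements of the two columns up by ordinal position (a sketch only: the numbers table, the source table name dbo.persons_import, and the assumption that every value fits in the varchar sizes shown are all mine, not from the post):

-- Build a numbers table once (1..8000 covers the longest multi-value string here)
CREATE TABLE dbo.Numbers (n INT PRIMARY KEY)
INSERT INTO dbo.Numbers (n)
SELECT n FROM
( SELECT ROW_NUMBER() OVER (ORDER BY c1.object_id) AS n
  FROM sys.columns c1 CROSS JOIN sys.columns c2 ) x
WHERE n <= 8000

-- Split associates and assocattrib and pair them up by position
INSERT INTO dbo.associates (namepkey, associd, assocattribute)
SELECT a.personid, CAST(a.assoc_val AS INT), b.attrib_val
FROM
(   -- element k of the associates column
    SELECT p.personid,
           ROW_NUMBER() OVER (PARTITION BY p.personid ORDER BY n.n) AS pos,
           SUBSTRING(p.associates, n.n,
                     CHARINDEX('~', p.associates + '~', n.n) - n.n) AS assoc_val
    FROM dbo.persons_import p
    JOIN dbo.Numbers n
      ON n.n <= LEN(p.associates)
     AND (n.n = 1 OR SUBSTRING(p.associates, n.n - 1, 1) = '~')
    WHERE p.associates IS NOT NULL
) a
JOIN
(   -- element k of the assocattrib column
    SELECT p.personid,
           ROW_NUMBER() OVER (PARTITION BY p.personid ORDER BY n.n) AS pos,
           SUBSTRING(p.assocattrib, n.n,
                     CHARINDEX('~', p.assocattrib + '~', n.n) - n.n) AS attrib_val
    FROM dbo.persons_import p
    JOIN dbo.Numbers n
      ON n.n <= LEN(p.assocattrib)
     AND (n.n = 1 OR SUBSTRING(p.assocattrib, n.n - 1, 1) = '~')
    WHERE p.assocattrib IS NOT NULL
) b
  ON b.personid = a.personid AND b.pos = a.pos

The sample row 58201 / 252427~252427~252427 / VICT/SUSP~WIT~RP would come out as the three rows shown above; rows with NULL associates simply produce nothing.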
View 3 Replies
Apr 14, 2015
I have a requirement to provide data in a denormalized form from normalized tables. Working in SSIS.
I have two tables: EmployeeCountry and Country.
EmployeeCountry
EmployeeId (PK)(FK)
CountryId(PK)(FK)
Country
CountryId (PK)
CountryName
There will only be a max of 3 Country entries for each Employee. So I want to select the EmployeeId and get the three CountryIds so it would look like this:
Employee
EmployeeId
CountryId1
CountryId2
CountryId3
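One way to flatten this in the SSIS source query is to number each employee's countries and pivot them into the three columns (a sketch, assuming SQL Server 2005 or later for ROW_NUMBER; the ordering by CountryId is arbitrary and could be any deterministic order):

SELECT ec.EmployeeId,
       MAX(CASE WHEN ec.rn = 1 THEN ec.CountryId END) AS CountryId1,
       MAX(CASE WHEN ec.rn = 2 THEN ec.CountryId END) AS CountryId2,
       MAX(CASE WHEN ec.rn = 3 THEN ec.CountryId END) AS CountryId3
FROM
(   SELECT EmployeeId, CountryId,
           ROW_NUMBER() OVER (PARTITION BY EmployeeId ORDER BY CountryId) AS rn
    FROM EmployeeCountry
) ec
GROUP BY ec.EmployeeId

Employees with fewer than three countries just get NULLs in the unused columns.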
View 9 Replies
Aug 12, 2015
I have a de-normalized table that I need to export to XML using FOR XML, but put all of the related rows under the same node. The table is a lot more complicated than the example below, but for proof-of-concept purposes I'll keep it really simple:
Campaign, Price
C1, 4.00
C1, 6.00
C1, 10.00
C2, 1.00
C2, 13.00
C3, 20.00
If I have a table of campaigns and prices, I would like to output it as XML like the following:
<Campaign name="C1">
<Price value="4.00"></Price>
<Price value="6.00"></Price>
<Price value="10.00"></Price>
</Campaign>
<Campaign name="C2">
<Price value="1.00"></Price>
etc
The default behaviour gives me this:
<Campaign name="C1" price="4.00"/>
<Campaign name="C1" price="6.00"/>
<Campaign name="C1" price="10.00"/>
Is there an existing option that deals with this automatically, or do I essentially need to do a group by to output the campaign element, and then union an ungrouped select to output the price element?
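There is no single switch that nests it automatically, but a correlated subquery with FOR XML PATH gets the grouped shape without a separate GROUP BY/UNION (a sketch, assuming SQL Server 2005+ and a table called dbo.CampaignPrices holding the two columns shown):

SELECT c.Campaign AS [@name],
       (SELECT p.Price AS [@value]
        FROM dbo.CampaignPrices p
        WHERE p.Campaign = c.Campaign
        FOR XML PATH('Price'), TYPE)
FROM (SELECT DISTINCT Campaign FROM dbo.CampaignPrices) c
FOR XML PATH('Campaign')

The Price elements come back self-closing (<Price value="4.00" />), which is equivalent XML to the <Price ...></Price> form in the example.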
View 4 Replies
Mar 6, 2015
I inherited a table with this structure:
Value a Value b
x date a
x date b
x date c
y date d
z date e
z date e
z date f
Value a fields are one to many. The objective is to obtain the maximum date value for each unique a value.
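If the goal is just the latest "b" date for each distinct "a" value, a plain GROUP BY should cover it (a sketch; the table and column names are placeholders since the real ones aren't given, and it assumes the date column is stored as a date/datetime type):

SELECT ValueA, MAX(ValueB) AS MaxDate
FROM dbo.InheritedTable
GROUP BY ValueA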
View 2 Replies
Dec 1, 2005
Hello,
I am currently working on a monthly load process with a datamart. I originally designed the tables in a normalized fashion with the idea that I would denormalize as needed once I got an idea of what the performance was like. There were some performance problems, so the decision was made to denormalize. Now the users are happy with the query performance, but loading the tables is much more difficult. Specifically...
There were two main tables, a header table and a line item table. These have been combined into one table. For my loads I still receive them as separate files though. The problem is that I might receive a line item for a header that began two months ago. When this happens I don't get a header record in the current month's file - I just get the record in the line items file. So now I have to join the header and line item tables in my staging database to get the denormalized rows, but I also may have to get header information from my main denormalized table (~150 million rows). For simplicity I will only include the primary keys and one other column to represent the rest of the row below. The tables are actually very wide.
Staging database:
CREATE TABLE dbo.Claims (
CLM_ID BIGINT NOT NULL,
CLM_DATA VARCHAR(100) NULL )
CREATE TABLE dbo.Claim_Lines (
CLM_ID BIGINT NOT NULL,
LN_NO SMALLINT NOT NULL,
CLM_LN_DATA VARCHAR(100) NULL )
Target database:
CREATE TABLE dbo.Target (
CLM_ID BIGINT NOT NULL,
LN_NO SMALLINT NOT NULL,
CLM_DATA VARCHAR(100) NULL,
CLM_LN_DATA VARCHAR(100) NULL )
I can either pull back all of the necessary header rows from the target table to the claims table and then do one insert using a join between claims and claim lines into the target table, OR I can do one insert with a join between claims and claim lines and then a second insert with a join between claim lines and target for those lines that weren't already added.
Some things that I've tried:
INSERT INTO Staging.dbo.Claims (CLM_ID, CLM_DATA)
SELECT DISTINCT T.CLM_ID, T.CLM_DATA
FROM Staging.dbo.Claim_Lines CL
LEFT OUTER JOIN Staging.dbo.Claims C ON C.CLM_ID = CL.CLM_ID
INNER JOIN Target.dbo.Target T ON T.CLM_ID = CL.CLM_ID
WHERE C.CLM_ID IS NULL
INSERT INTO Staging.dbo.Claims (CLM_ID, CLM_DATA)
SELECT T.CLM_ID, T.CLM_DATA
FROM Staging.dbo.Claim_Lines CL
LEFT OUTER JOIN Staging.dbo.Claims C ON C.CLM_ID = CL.CLM_ID
INNER JOIN Target.dbo.Target T ON T.CLM_ID = CL.CLM_ID
WHERE C.CLM_ID IS NULL
GROUP BY T.CLM_ID, T.CLM_DATA
INSERT INTO Staging.dbo.Claims (CLM_ID, CLM_DATA)
SELECT DISTINCT T.CLM_ID, T.CLM_DATA
FROM Target.dbo.Target T
INNER JOIN (SELECT CL.CLM_ID
            FROM Staging.dbo.Claim_Lines CL
            LEFT OUTER JOIN Staging.dbo.Claims C ON C.CLM_ID = CL.CLM_ID
            WHERE C.CLM_ID IS NULL) SQ ON SQ.CLM_ID = T.CLM_ID
I've also used EXISTS and IN in various queries. No matter which method I use, the query plans tend to want to do a clustered index scan on the target table (actually a partitioned view partitioned by year). The number of headers that were in the target but not the header file this month was about 42K out of 1M.
So.... any other ideas on how I can set up a query to get the distinct headers from the denormalized table? Right now I'm considering using worktables if I can't figure anything else out, but I don't know if that will really help me much either.
I'm not looking for a definitive answer here, just some ideas that I can try.
Thanks,
-Tom.
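Since worktables were already on the table, here is a rough sketch of that idea (not from the thread, and whether it avoids the clustered index scan still depends on the indexing of the partitioned view): materialize just the missing header keys first, so the large target is probed once against a narrow, indexed key list.

-- 1. Collect the claim IDs that appear in the lines file but not the header file
SELECT DISTINCT CL.CLM_ID
INTO #MissingHeaders
FROM Staging.dbo.Claim_Lines CL
WHERE NOT EXISTS (SELECT 1 FROM Staging.dbo.Claims C WHERE C.CLM_ID = CL.CLM_ID)

CREATE UNIQUE CLUSTERED INDEX IX_MissingHeaders ON #MissingHeaders (CLM_ID)

-- 2. Pull only those headers back from the large denormalized target
INSERT INTO Staging.dbo.Claims (CLM_ID, CLM_DATA)
SELECT DISTINCT T.CLM_ID, T.CLM_DATA
FROM Target.dbo.Target T
JOIN #MissingHeaders M ON M.CLM_ID = T.CLM_ID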
View 2 Replies
Aug 27, 2002
Thanks for reading.
This is pretty long, hopefully it isn't rambling.
I'm building a system that imports data from several sources - Excel files, text files, Access databases, etc. - using DTS. The entire process revolves around MS SQL Server, by the way.
I figured I would create denormalized tables that mirror the Excel and flat files, for example, in structure, import data to those, clean up and remove duplicates there, then break those out into my normalized table structure later.
Now I've finished the importing part (though this is going to happen once a week) and I'm onto breaking up the denormalized tables.
I'm hesitating because I'm not sure I've made the best decisions in terms of process, etc.
I've decided to use cursors to loop over the denormalized tables and use batch insert statements to push data out to the appropriate tables.
Any comments? Suggestions? All is welcome.
I'm specifically interested in hearing back on the way I've set up the intermediate, denormalized tables and how I'm breaking them up using cursors (step 2 of the process below). Still, all comments are welcome. As are suggestions for further reading.
Thanks again...
simplified example
(my denormalized tables are 20 - 30 columns wide)
denormalized table:
===================
name, address, city, state, cellphone, homephone
normalized tables:
==================
tblPerson [PK_person, name, age, height, weight]
tblAddress [PK_address, FK_person, street, city, state, zip, addressType]
tblContact [PK_contact, FK_person, data, contactType]
I'm breaking up the denormalized tables like this (*UNTESTED*):
=================================================
-- one variable for each column pulled from the wide table, matching size and type
DECLARE @name VARCHAR(100), @address VARCHAR(100), @city VARCHAR(50),
        @state VARCHAR(25), @cellphone VARCHAR(20), @homephone VARCHAR(20),
        @personID INT

DECLARE myCursor CURSOR
FAST_FORWARD FOR
SELECT name, address, city, state, cellphone, homephone
FROM _DNT_myWideTable

OPEN myCursor

-- grab the first row from the wide table
FETCH NEXT FROM myCursor
INTO @name, @address, @city, @state, @cellphone, @homephone

WHILE @@Fetch_Status = 0
BEGIN
    -- create the person first and get the ID with @@IDENTITY
    INSERT INTO tblPerson (name) VALUES (@name)
    SET @personID = @@IDENTITY

    -- use that ID to coordinate inserts across other tables
    INSERT INTO tblAddress (FK_person, street, city, state, addressType)
    VALUES (@personID, @address, @city, @state, 'HOME')

    INSERT INTO tblContact (FK_person, data, contactType)
    VALUES (@personID, @cellphone, 'CELLPHONE')

    INSERT INTO tblContact (FK_person, data, contactType)
    VALUES (@personID, @homephone, 'HOMEPHONE')

    -- grab the next row
    FETCH NEXT FROM myCursor
    INTO @name, @address, @city, @state, @cellphone, @homephone
END

CLOSE myCursor
DEALLOCATE myCursor
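A possible set-based alternative to the cursor (a sketch only: it assumes the wide staging table has, or can be given, its own surrogate key, here called DNT_ID, and that tblPerson can temporarily carry that key in a column called SourceID so rows can be matched back without looping):

-- 1. Load all people in one statement, remembering which wide row each came from
INSERT INTO tblPerson (name, SourceID)
SELECT name, DNT_ID
FROM _DNT_myWideTable

-- 2. Addresses: join the wide table back to the people just created
INSERT INTO tblAddress (FK_person, street, city, state, addressType)
SELECT p.PK_person, w.address, w.city, w.state, 'HOME'
FROM _DNT_myWideTable w
JOIN tblPerson p ON p.SourceID = w.DNT_ID

-- 3. Contacts: one insert per contact type
INSERT INTO tblContact (FK_person, data, contactType)
SELECT p.PK_person, w.cellphone, 'CELLPHONE'
FROM _DNT_myWideTable w
JOIN tblPerson p ON p.SourceID = w.DNT_ID
WHERE w.cellphone IS NOT NULL

INSERT INTO tblContact (FK_person, data, contactType)
SELECT p.PK_person, w.homephone, 'HOMEPHONE'
FROM _DNT_myWideTable w
JOIN tblPerson p ON p.SourceID = w.DNT_ID
WHERE w.homephone IS NOT NULL

The SourceID column can be dropped (or simply ignored) once the weekly load is done; the point is just to avoid a row-by-row @@IDENTITY round trip.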
View 1 Replies
Apr 17, 2006
Hello,
I am pretty new to SSIS, so please excuse me if this is a trivial question.
I have a denormalized database table in an Access database that I need to import into several different tables in a SQL 2005 database. You can think of the Access table as a CustomerOrders table. For example customer related information (i.e. CustomerName, CustomerID, etc...) is repeated with each record in the Access table. When this data gets moved to the SQL 2005 database, I need to insert one record for each distinct CustomerName/Customer ID record into a Customers table. I then need to insert and link every "Order" record into an "Orders" table.
I am sure that this is probably a pretty common task, but I have not found any examples or articles explaining this particular situation. What ways can this be done?
I was thinking I need to loop through each DISTINCT Customer record in the Access (source) table and insert a Customer record into the destination database's Customer table. I would then need to iterate through each row of the Access (source) table and "Lookup" the appropriate CustomerID/Key Field and insert an "Order" record.
The Access table contains over 75,000 rows of data. I am looking for the most appropriate way of doing this with SSIS (so that I don't have to write a custom application to do this!). Any help, input, links, articles, etc. is appreciated!!
TIA
-Brian
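If the Access table is first landed in a SQL Server staging table (a simple straight-copy data flow), the split can then be done in two set-based statements instead of looping row by row - the Lookup-based approach described above is the pure-SSIS equivalent. A sketch, with staging and destination table/column names that are assumptions:

-- 1. One row per distinct customer
INSERT INTO dbo.Customers (CustomerID, CustomerName)
SELECT DISTINCT s.CustomerID, s.CustomerName
FROM dbo.Staging_CustomerOrders s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Customers c WHERE c.CustomerID = s.CustomerID)

-- 2. Every order row, linked back to the customer just created
INSERT INTO dbo.Orders (CustomerKey, OrderNumber, OrderDate)
SELECT c.CustomerKey, s.OrderNumber, s.OrderDate
FROM dbo.Staging_CustomerOrders s
JOIN dbo.Customers c ON c.CustomerID = s.CustomerID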
View 1 Replies
Nov 24, 2006
Hi, all here,
Thank you very much for your kind attention.
I am wondering if it is possible to use SSIS to sample a data set into a training set and a test set and feed them directly to my data mining models, without saving them somewhere, as they would occupy too much space? I really need guidance on that.
Thank you very much in advance for any help.
With best regards,
Yours sincerely,
View 5 Replies
Dec 14, 2005
After testing out the application I wrote on the local PC, I deployed it to the web server to test it out. I get this error:
System.Data.SqlClient.SqlException: The conversion of a char data type to a
datetime data type resulted in an out-of-range datetime value.
Notes: all pages that have this error have either a repeater or a datagrid which loads data during page load.
At first I thought the problem was with the date, but then I can see
that some other pages that have a datagrid (with a date field) work
just fine.
anyone having this problem before?? hopefully you guys can help.
Thanks,
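This error usually comes from an implicit char-to-datetime conversion whose interpretation depends on the connection's language/DATEFORMAT setting, which can differ between the local PC and the web server. A quick way to rule that in or out is to make the conversion explicit (a sketch; the table and column names are placeholders):

-- Implicit conversion: depends on the login's language / DATEFORMAT setting
-- SELECT * FROM dbo.Orders WHERE OrderDate >= '13/12/2005'   -- out-of-range under mdy

-- Explicit, unambiguous forms:
SELECT * FROM dbo.Orders WHERE OrderDate >= CONVERT(datetime, '2005-12-13T00:00:00', 126)  -- ISO 8601
SELECT * FROM dbo.Orders WHERE OrderDate >= CONVERT(datetime, '13/12/2005', 103)           -- dd/mm/yyyy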
View 4 Replies
Dec 4, 2007
I have used both data readers and data adapters(with datasets) in the projects that I have worked on. I am trying to get some clarification on when I should be using which one. I think I am doing this correctly but I want to be sure I am developing good habits.
As the name might suggest, it seems like a datareader is for only reading data. I have read that the data adapter and dataset are for a disconnected architecture. Or, that they can be used for this type of set up. I have been using the data adapter and datasets when writing to a database and the datareader when reading from a database.
Is this how these should be used? Is the data reader the best choice for reading data? Am I doing this the optimal way from a performance stand point?
......................................................thanks in advance
View 1 Replies
Nov 2, 2015
We have already integrated different client data into MDS with the MS Excel plugin; now we want to push updated or newly added records back to the source database. Is it possible to do this using MDS? Do we have any background sync process which automatically syncs data to and from the subscriber and MDS?
View 4 Replies
Oct 18, 2006
When I enter over 4000 chars in any ntext field in my SQL Server 2005 database (directly in the database and through the application) I get an error saying that the data could not be updated because string or binary data would be truncated. Has anyone ever seen this? I cannot figure out what is causing it; ntext should be able to hold a lot more data than this...
View 7 Replies
Aug 12, 2015
I have a requirement to implement CDC for 50+ tables, to capture incremental data changes for warehousing/reporting rather than exporting the whole table data. The largest table has more than half a billion records.
The warehouse uses a daily copy of the OLTP db (daily DB refresh). How can I accomplish this? Is there a downside in implementing CDC just for the sake of taking incremental changes on the tables?
Is there any performance impact if we enable CDC on OLTP db?
Can we make use of the CDC tables on the environment we do daily db refresh so that the queries don't hit OLTP database?
What is the best way to implement CDC to take incremental changes for reporting?
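For reference, enabling CDC itself is per-database and then per-table (a sketch, assuming SQL Server 2008 or later since CDC is not available in 2005; the database, table, and capture-instance names are placeholders):

-- Enable CDC on the OLTP database
USE MyOltpDb;
EXEC sys.sp_cdc_enable_db;

-- Enable CDC on one of the 50+ tables (net changes need a PK or unique index)
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'BigTable',
     @role_name     = NULL,
     @supports_net_changes = 1;

-- Later, pull only the rows changed since the last ETL run
DECLARE @from binary(10) = sys.fn_cdc_get_min_lsn('dbo_BigTable'),
        @to   binary(10) = sys.fn_cdc_get_max_lsn();
SELECT * FROM cdc.fn_cdc_get_net_changes_dbo_BigTable(@from, @to, N'all');

Note the change tables live in the OLTP database and are populated by the log reader, so there is some overhead there; they would not survive a plain restore-based daily refresh of the warehouse copy, which is worth weighing against the refresh approach described above.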
View 0 Replies
Jul 20, 2005
Hi,
This is driving me nuts. I have a table that stores notes regarding an operation in an IMAGE data type field in MS SQL Server 2000.
I can read and write no problem using Access using the StrConv function, and I can update the field correctly in T-SQL using:
DECLARE @ptrval varbinary(16)
SELECT @ptrval = TEXTPTR(BITS_data)
FROM mytable_BINARY WHERE ID = 'RB215'
WRITETEXT OPERATION_BINARY.BITS @ptrval 'My notes for this operation'
However, I just can not seem to be able to convert the information back to text once it is stored, using T-SQL. My selects keep returning bin data.
How to do this? Thanks for your help.
SD
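One possible way to read the stored bytes back as text on SQL Server 2000 is to take a varbinary slice of the image column and cast it (a sketch; it assumes the notes were written as single-byte ANSI text, as the WRITETEXT above suggests, and that they fit in 8000 bytes; table and column names copied from the post):

-- SUBSTRING on an image column returns varbinary, which can then be cast to varchar
SELECT CAST(SUBSTRING(BITS_data, 1, 8000) AS varchar(8000)) AS OperationNotes
FROM mytable_BINARY
WHERE ID = 'RB215'

-- If the data had been written as Unicode, cast to nvarchar(4000) instead.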
View 1 Replies
Nov 10, 2015
I'm using a Script Component to load data into an Oracle DB due to a poor performance issue. Now I have found it is missing some data during the transmission. Please see the screenshots (SQL Server row count vs. Oracle row count - not reproduced here).
DDL:
create table Person
(
BusinessEntityID Integer,
FirstName nvarchar2(50),
MiddleName nvarchar2(50),
LastName nvarchar2(50)
);
Result: (screenshot not reproduced)
I followed this article: [URL] ....
VB Script:
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
[Code] ..........
View 8 Replies
Apr 16, 2008
Hi all, i got this error:
[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
and also this:
[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.
I tried going data flow->excel connection->advanced editor for excel source-> input and output properties and tried to refresh the columns affected.
It seems that somehow the 3 columns are not read in from the source file?
And also, Fiscal Year and Fiscal Week are not set up properly in my data destination?
Has anyone faced such errors before?
Thanks
View 13 Replies
Jul 24, 2015
When I execute the below stored procedure I get the error that "Arithmetic overflow error converting expression to data type int".
USE [FileSharing]
GO
/****** Object: StoredProcedure [dbo].[xlaAFSsp_reports] Script Date: 24.07.2015 17:04:10 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
[Code] .....
Msg 8115, Level 16, State 2, Procedure xlaAFSsp_reports, Line 25
Arithmetic overflow error converting expression to data type int.
The statement has been terminated.
(1 row(s) affected)
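Without seeing line 25 of the procedure, the usual cause of this message is a SUM (or similar expression) over an int column whose running total exceeds the int range; casting the column up before aggregating avoids it (a sketch with placeholder table and column names):

-- Overflows once the running total passes 2,147,483,647:
-- SELECT SUM(FileSizeBytes) FROM dbo.SharedFiles

-- Cast to bigint (or decimal) before summing:
SELECT SUM(CAST(FileSizeBytes AS bigint)) AS TotalBytes
FROM dbo.SharedFiles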
View 10 Replies
Feb 5, 2007
Is there a step-by-step paper to get there? Here is what I need to consider: I will have many customers that will need their own set of records and access pages "branded for their company"; each customer will have many clients. I am hosting this application on a Windows 2003 server with SQL Server 2005 Enterprise.
I am using Windows authentication. I have created a username in Windows, then I added the Windows user in SQL Management Studio under Security, granted "DB Read" and "DB Write", and again under the database's Security tab. Still, from the web, authentication fails. I must be missing a step or two?
I expect to set up a username for each database as i setup new customers.
View 1 Replies
Feb 23, 2008
RE: XML Data source .. Expression? Variable? Connection? Error: unable to read the XML data.
I want my XML Data source to be an expression as i will be looping through a directory of xml files.
I don't see the expression property or the connection property??
I tried setting the XMLData property to @[User::filename], but that results in:
Information: 0x40043006 at Load XML Files, DTS.Pipeline: Prepare for Execute phase is beginning.
Error: 0xC02090D0 at Load XML Files, XML Source [108]: The component "XML Source" (108) was unable to read the XML data.
Error: 0xC0047019 at Load XML Files, DTS.Pipeline: component "XML Source" (108) failed the prepare phase and returned error code 0xC02090D0.
Information: 0x4004300B at Load XML Files, DTS.Pipeline: "component "OLE DB Destination" (341)" wrote 0 rows.
Task failed: Load XML Files
Information: 0xC002F30E at Bad, File System Task: File or directory "d:jcpxmlLoadjcp2.xml.bad" was deleted.
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
The program '[3312] Package.dtsx: DTS' has exited with code 0 (0x0).
Thanks for any help or information.
View 3 Replies
Sep 28, 2015
I set up this package to import data from a SharePoint list to a SQL Server data table. The primary key of my SQL table is mapped to the Title column of my SharePoint list. There is a possibility that duplicate values will be entered in the Title field of the SharePoint list, so when importing data into my table via SSIS, my package always errors out when it comes across duplicate values. How have others managed data integrity when importing from a SharePoint list with the Title column being mapped to the primary key of a table?
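One common pattern is to land the SharePoint list in a staging table first and let T-SQL decide which duplicate to keep before touching the real table (a sketch, assuming SQL Server 2005+ for ROW_NUMBER; the table and column names, and the "keep the most recently modified row" rule, are assumptions):

-- Keep only one row per Title (here: the most recently modified one)
INSERT INTO dbo.TargetTable (Title, OtherColumn)
SELECT s.Title, s.OtherColumn
FROM
(   SELECT Title, OtherColumn, Modified,
           ROW_NUMBER() OVER (PARTITION BY Title ORDER BY Modified DESC) AS rn
    FROM dbo.Staging_SharePointList
) s
WHERE s.rn = 1
  AND NOT EXISTS (SELECT 1 FROM dbo.TargetTable t WHERE t.Title = s.Title)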
View 4 Replies
Feb 12, 2008
Hello,
I am wondering what conversion rules apply, when a string, which contains a number, is saved to a SQL Server 2005 into a column of type decimal.
This is the code I'm using (C++):
CString cValue = "0.75";
_variant_t vtFieldValue;
vtFieldValue = _variant_t(cValue);
pRecordSet->Fields->Item["MyColumn"]->Value = vtFieldValue;
"pRecordSet" is an ADO recordset. The database column "MyColumn" is of type "decimal(19,10)".
The most important question for me is, if the regional settings of the database server or the regional settings of the client PC are considered during the conversion from the string to the decimal value. For example in standard French regional settings the "." would not be recognized as decimal separator.
I am also wondering if the language of the database instance, in which this data is saved, is considered during this conversion or any other settings of this database instance.
So my general question is: Does anybody know exactly what rules apply during the above mentioned conversion?
Thank you for your help.
Regards,
Volker
View 2 Replies
Oct 12, 2015
I have a question about how to handle structural data model changes in a data source of PowerPivot. Suppose I'm developing a star model in SQL Server and sometimes a datatype changes or the name of a field changes in a table. It seems to me that PowerPivot does not handle this as gracefully as Analysis Services MD does (mostly). I received an error because of a wrong field name, and even no error when a datatype changed, in PowerPivot. Is this common or am I doing something wrong here? Does this mean that every time the data model changes the PowerPivot model should be recreated? Or am I missing the clue here?
View 6 Replies
Jun 13, 2006
Hi everyone,
I have to extract, daily, a list of contacts on an Exchange server into a table on our EDW on SQL Server 2005. Is it possible to get the information directly from a data flow or will I have to develop a script task?
Need help desperately!!!
View 3 Replies
Apr 22, 2015
I am getting ErrorCode 8 while loading the data from stage to model. I have checked my error view; it states that "Member Code is Inactive".
Initially I had loaded the same set of data into the model from the MDS stage table, but then deleted it with ImportType = 5, which removed all the data from the MDM model.
Now I want to load it back but it's giving Error Code 8... Before loading the same data I changed the stage table ImportType to 2 and ImportStatusId to 0.
View 6 Replies
Jun 19, 2007
Hello,
I have noticed that for one of my data-flows, the process is really long during the phase "the final commit data insertion has started".
To be accurate, the process is fast until it reaches this phase. It happens often when I load millions of lines.
The extraction is done from a database SQL Server 2005 to a database SQL Server 2005, on the same server (with the SQL Server native provider).
I used a SQL Server destination but I have tried with an OLE DB destination and it is the same situation.
Why could the process be so long during this phase?
Is there a way to optimise my package to avoid that?
Any idea is welcome.
Thanks.
Guillaume
View 6 Replies
Nov 1, 2006
Hi
I'm having problems following the tutorial on creating a data access layer - http://www.asp.net/learn/dataaccess/tutorial01cs.aspx?tabid=63 - when I try to compile in Visual Studio 2005 I get "namespace could not be found". I followed the tutorial exactly - I created a dataset and added this code in my aspx page:
<asp:GridView ID="GridView1" runat="server" CssClass="DataWebControlStyle">
<HeaderStyle CssClass="HeaderStyle" />
<AlternatingRowStyle CssClass="AlternatingRowStyle" />
In my C# file I added these lines...
using NorthwindTableAdapters; <<<<< this is the problem - where does this come from?
protected void Page_Load(object sender, EventArgs e)
{
    ProductsTableAdapter productsAdapter = new ProductsTableAdapter();
    GridView1.DataSource = productsAdapter.GetProducts();
    GridView1.DataBind();
}
Thanks in advance
View 6 Replies
Jun 11, 2007
My vendor requires data to be sent in Excel format. Some of my tables have over 65,536 rows so I need to use Excel 2007 (max of 1,048,576). Right now my data sits in SQL 2000. I am using MS SQL Enterprise Manager 8.0 to prepare the data. Is there some kind of add-on or selection I am missing to use DTS to export from SQL to Excel 2007? Thanks in advance.
View 3 Replies
May 7, 2014
I think I am definitely thrashing and am not getting anywhere on something I think should be pretty simple to accomplish: I need to pull the total amounts for compartments with different products which are under the same manifest and the same document number, conditionally based on whether the document types are "Starting" or "Ending", but the values come from the "Adjust" records.
So here is the DDL, sample data, and the ideal return rows
CREATE TABLE #InvLogData
(
Id BIGINT, --is actually an identity column
Manifest_Id BIGINT,
Doc_Num BIGINT,
Doc_Type CHAR(1), -- S = Starting, E = Ending, A = Adjust
Compart_Id TINYINT,
[Code] ....
I have tried a combination of the below statements but I keep coming back to not being able to actually grab the correct rows.
SELECT DISTINCT(column X)
FROM #InvLogData
GROUP BY X
HAVING COUNT(DISTINCT X) > 1
One further minor problem: I need to make this a set-based solution. This table grows by a couple hundred thousand rows a week, a co-worker suggested using a <shudder/> cursor to do the work but it would never be performant.
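Since the rest of the DDL is elided above, any sketch has to assume the remaining columns (here a Product_Id and an Amount per row, both hypothetical); but the general shape that tends to work for "values come from the Adjust rows, gated by the Starting/Ending rows" is an EXISTS back to the matching 'S'/'E' rows on manifest, document number, and compartment, with a plain set-based GROUP BY rather than a cursor:

SELECT adj.Manifest_Id,
       adj.Doc_Num,
       adj.Compart_Id,
       adj.Product_Id,                 -- assumed column
       SUM(adj.Amount) AS TotalAmount  -- assumed column
FROM #InvLogData adj
WHERE adj.Doc_Type = 'A'
  AND EXISTS (SELECT 1
              FROM #InvLogData se
              WHERE se.Manifest_Id = adj.Manifest_Id
                AND se.Doc_Num     = adj.Doc_Num
                AND se.Compart_Id  = adj.Compart_Id
                AND se.Doc_Type IN ('S', 'E'))
GROUP BY adj.Manifest_Id, adj.Doc_Num, adj.Compart_Id, adj.Product_Id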
View 9 Replies
Jan 12, 2015
I was wondering if there is a best-practice minimum set of permissions for creating a SQL login to use when setting up a new shared data source for SSRS Report Manager?
Something along the lines of it being a data reader for the DB and having permission to use tempdb?
I would have thought it not advisable to have the login be able to update the main DB...
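A minimal setup along those lines (a sketch; the login, database, and role names are placeholders) is read-only membership in the report source database and nothing more - creating temporary tables in tempdb generally needs no explicit grant:

-- 1. SQL login used only by the SSRS shared data source
CREATE LOGIN ssrs_reader WITH PASSWORD = 'use-a-strong-password-here';

-- 2. Map it into the reporting database as read-only
USE ReportSourceDb;
CREATE USER ssrs_reader FOR LOGIN ssrs_reader;
EXEC sp_addrolemember N'db_datareader', N'ssrs_reader';

-- 3. If the reports call stored procedures, grant execute as well (or instead)
-- GRANT EXECUTE ON SCHEMA::dbo TO ssrs_reader;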
View 1 Replies
Sep 4, 2015
I need to create a function that replaces the data in a column with an 'X' based on the LEN of the data in the column. I created one that does a replacement, but it fills the column based on the max data length, not the current length of the string or integer. An example of what I'm trying to accomplish:
Original data in a varchar(30) column:
thisisavalue
thisisanothervalue
thisisanothervalueagain
shortval
replaced with
xxxxxxxxxx
xxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxx
xxxxxxx
My current function is replacing the data like this:
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
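The masking can be driven off the length of each individual value rather than the column's declared size by combining REPLICATE with LEN (a sketch; the table and column names are placeholders):

-- Mask a value with one 'x' per character of the actual value
SELECT ColumnValue,
       REPLICATE('x', LEN(ColumnValue)) AS MaskedValue
FROM dbo.SourceTable

-- Or applied in place
UPDATE dbo.SourceTable
SET ColumnValue = REPLICATE('x', LEN(ColumnValue))

Note that LEN ignores trailing spaces; if those need masking too, DATALENGTH (for non-Unicode columns) can be used instead.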
View 4 Replies
Jul 20, 2005
Hello all. Before my arrival at my current employer, our consultants physically set up our MSSQL 7 server as follows:
drive c: contains the mssql engine
drive d: contains the transaction log
drive e: contains the data files
No filegroups were set up and the data files consist of only 1 large physical file. Currently, our data file is >10GB. When I was trained on the physical aspects of SQL Server, I was told to never create physical files > 2048MB each. If I did, I could expect inefficient physical storage of data and slower performance (due to the OS).
Our server has 2 RAID-5 arrays. Drives c: and e: are located on the first array and drive d: on the second. We're running Windows NT 4.0 Server SP6 with NTFS.
Can someone comment on the use of 1 single large data file vs. more smaller data files?
View 2 Replies
Jun 11, 2015
I am trying to find a reference for a client that lists the fields available to be substituted into a data-driven subscription from the query, along with the expected data types. For example, the field on whether or not to include a link to the report seems to be expecting a bit data type. I have searched and can't seem to find anything. I guess I could walk through the interface and try different data types, but if a list exists, that would be better.
View 4 Replies