I have an SCD that contains a historical output path. If I put a Data Viewer in the flow before the SCD, I can see a row that should trigger a trip down that lane, but it's not happening. In the Advanced Editor all looks good. Is there anything I can check? BTW, the data type of the fields that should trigger the SCD change is datetime.
I have a few tables that I need to export to an MS Access database each day to send off to the customer. In these tables I have a few columns that I need to strip all the HTML out of before they are put into Access, since Access does not display the formatting correctly. Some of the fields are varchar and some are text, but all of the fields that are varchar are over 255 in size, so this forces me to use Access' Memo type for both.
In SQL 2000 I used an ActiveX Script to modify these fields, stripping out the HTML, and it gave me no complaints. In SQL 2005, I am adding a Script Component to the Data Flow before the destination and putting the script in there. So far so good. The problem arises when I try to retrieve and update the data in these fields, because they are treated as BLOB data. I believe what I need to do is retrieve the byte array from the row using GetBlobData(), convert that to a string, strip out the HTML, convert back to a byte array, clear the original value using ResetBlobData(), and then update the field using AddBlobData(). I think?
I am not even sure that my code below converts the byte array into a string correctly. If I throw a MessageBox in there to see what it has for the string, it just gives me the first character of the field. But even ignoring that, the package will not execute and I am stumped.
The error I now get is: Array cannot be null. Parameter name: bytes
''-----------------------------------------------------
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' Check for null blobs first; OrElse short-circuits, unlike And/Or,
    ' so Length is never read on a null column. In the original code, b
    ' stayed Nothing whenever the If test failed, and passing Nothing to
    ' GetString() is what raises "Array cannot be null. Parameter name: bytes".
    If Row.ProjectDescription_IsNull OrElse Row.ProjectDescription.Length = 0 Then
        Exit Sub
    End If

    Dim b As Byte() = Row.ProjectDescription.GetBlobData(0, CInt(Row.ProjectDescription.Length))

    ' Decode, strip the HTML, and re-encode. Note: if the column is DT_NTEXT
    ' the bytes are UTF-16, and System.Text.Encoding.Unicode would be the right
    ' decoder instead of ASCII -- that would also explain the MessageBox showing
    ' only the first character (display stops at the first embedded null byte).
    Dim enc As New System.Text.ASCIIEncoding()
    Dim str As String = enc.GetString(b)
    str = RemoveHTML(str)
    b = enc.GetBytes(str)

    Row.ProjectDescription.ResetBlobData()
    If b.Length > 0 Then
        Row.ProjectDescription.AddBlobData(b)
    End If
End Sub
''-----------------------------------------------------
How to evaluate a working day through a Script Component in SSIS
I have a data source. I need to verify whether each date falls on a working day; if it falls on a holiday or weekend, then I need to send that row to the output table. I think the only way I can verify this working-day issue is through a Script Component, as sketched below. Can anyone give me some ideas/thoughts?
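A sketch of that Script Component approach. All names are assumptions for illustration: two outputs, WorkingDay and NonWorkingDay, in the same exclusion group; an input column TransactionDate; and a hard-coded holiday list that in practice would come from a holiday table or a package variable.

Public Class ScriptMain
    Inherits UserComponent

    ' Hypothetical fixed-date holidays as MM-dd strings.
    Private ReadOnly holidays() As String = {"01-01", "07-04", "12-25"}

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        Dim d As Date = Row.TransactionDate

        Dim isWeekend As Boolean = (d.DayOfWeek = DayOfWeek.Saturday) OrElse (d.DayOfWeek = DayOfWeek.Sunday)
        Dim isHoliday As Boolean = Array.IndexOf(holidays, d.ToString("MM-dd")) >= 0

        If isWeekend OrElse isHoliday Then
            Row.DirectRowToNonWorkingDay()   ' route these rows to the output table
        Else
            Row.DirectRowToWorkingDay()
        End If
    End Sub
End Class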
In the code behind the Script Transformation component, neither the MsgBox nor any of the breakpoints seem to work. Am I missing something that needs to be done in order for these to get executed?
I have a blob which is essentially an email, and I need to replace all carriage returns in the blob with a special character ("€°"), but it does not work. Please advise.
Dim intBlobLen As Integer = Convert.ToInt32(Row.body.Length)

' GetBlobData takes a count of bytes, not an end index; the original passed
' intBlobLen - 1 and silently dropped the last byte.
Dim bytBlob As Byte() = Row.body.GetBlobData(0, intBlobLen)

Dim strBlob As String = System.Text.Encoding.Unicode.GetString(bytBlob)

' VB.NET has no "\n"-style escapes, so the original literal "/n" never matched
' anything; use the vbCrLf / vbCr / vbLf constants for line breaks instead.
strBlob = Replace(strBlob, vbCrLf, "€°")

If strBlob.Length > 0 Then
    Row.body.ResetBlobData()
    Row.body.AddBlobData(System.Text.Encoding.Unicode.GetBytes(strBlob))
End If
I have a package that has a data flow task. This task imports data from a DB2 database (using the IBM OLE DB Provider for DB2) and adds it to a SQL Server database table. This package was created on the server. Then, through version control (using TFS source control), I checked out the package on my local machine, and when I open the package I get the following 3 errors.
Error 1 Validation error. Import Account Num from BMGP_BDR: DTS.Pipeline: The component metadata for "component "DataReader Source" (1113)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed.
Error 2 Error loading BMAG Download Xref Tables - bmag.dtsx: Microsoft.SqlServer.Dts.Pipeline.ComponentVersionMismatchException: The version of component "DataReader Source" (1113) is not compatible with this version of the DataFlow. [[The version or pipeline version or both for the specified component is higher than the current version. This package was probably created on a new version of DTS or the component than is installed on the current PC.]] at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostCheckAndPerformUpgrade(IDTSManagedComponentWrapper90 wrapper, Int32 lPipelineVersion)
Error 3 Error loading BMAG Download Xref Tables - bmag.dtsx: The component metadata for "component "DataReader Source" (1113)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed.
I have a package which reads an Access file from a folder. My connection manager to this file is .NET Providers for OLE DB\Microsoft Jet 4.0 OLE DB Provider.
The package works from my computer, but when I execute it on the server as a SQL Agent job, I get:
The component metadata for "component "DataReader Source" (1)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed.
I copied the mdb file to a folder on the server which my packages have no problem reading data from.
My packages run under the same domain account as defined in proxies.
I need some advice on how to create a historical database.
What is the best way of doing this? For example, should I create a new row when a column changes in a row, and timestamp all records? What happens when I have child tables linked to a header table?
I have been looking on the Net for methods of creating a historical database, but I can't find any.
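One common pattern, sketched below with invented table and column names: keep the current row in the main table and append a dated copy to a matching history table whenever it changes. Each child table gets its own history table keyed the same way, so a header and its children can be reassembled as of any date.

CREATE TABLE OrderHeader (
    OrderID    int         NOT NULL PRIMARY KEY,
    CustomerID int         NOT NULL,
    Status     varchar(20) NOT NULL
)

-- Same columns plus a validity window; one row per version of the header.
CREATE TABLE OrderHeader_History (
    OrderID    int         NOT NULL,
    CustomerID int         NOT NULL,
    Status     varchar(20) NOT NULL,
    ValidFrom  datetime    NOT NULL,
    ValidTo    datetime    NULL,   -- NULL until the next change closes this version
    PRIMARY KEY (OrderID, ValidFrom)
)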
I have a question about a historical table. I have a table in SQL Server that keeps the history of my clients for the last 3 months. This table has a PK: client_code and rep_date_id. I want to write a query that returns a client and extracts the date when it was registered for the first time.
I thought about a cursor, but I don't know how to use it. My table looks like this: client_code, activity_code, country_code, district_code, ..., rep_date_id.
And I want to return: activity_code, country_code, district_code, ..., start_date, current_date.
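No cursor should be needed; a grouped MIN gives the first registration date per client. A sketch, assuming the table is named client_history and rep_date_id is (or converts to) a date:

SELECT  h.client_code,
        h.activity_code,
        h.country_code,
        h.district_code,
        f.start_date,
        GETDATE() AS [current_date]
FROM    client_history h
JOIN    (SELECT client_code, MIN(rep_date_id) AS start_date
         FROM client_history
         GROUP BY client_code) f
        ON  f.client_code = h.client_code
        AND f.start_date  = h.rep_date_id   -- keeps the attributes as of the first record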
A general data design question: We have data which changes every week. We had considered separating historical records and current records into two different tables with the same columns, but thought it might be simpler to have them all together in one table and just add a WeekID int column to indicate which week it represents (and perhaps an IsCurrent bit column to make querying easier). We have a number of tables like this, holding weekly data, and we'll have to query for historical data often, but only back through the last year -- we have historical data going back to 1998 or so which we'll rarely if ever look at. Is the all-in-one-table approach better, or the separation of current and historical data? Will there be a performance hit to organizing data this way? I don't think the extra columns will make querying too much more awkward, but is there anything I'm overlooking? Thanks.
From what I've read this is called 'slowly changing dimensions'. Basically, the system I'm working on needs to store the history of certain data so that at any time a user can look up an old project and view it exactly as it was, even though the associated parts might have had certain changes over time. From what I can tell, Type 2 (current and historical records are stored in the same table) seems to be the most popular. Type 4 (current records in one table and historical records in a separate history table) seems like it would also work, but I've been unable to find any articles comparing the two. Does anybody have any info on the advantages/disadvantages of one vs. the other?
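For reference, a minimal sketch of the two shapes (all names invented). The usual trade-off: Type 2 keeps point-in-time queries in a single table at the cost of a larger, versioned dimension; Type 4 keeps the current table small and fast, but as-of queries have to go to the history table.

-- Type 2: versions share one table; the current row has EndDate IS NULL.
CREATE TABLE DimPart (
    PartKey   int IDENTITY PRIMARY KEY,  -- surrogate key
    PartID    int NOT NULL,              -- business key, repeats across versions
    Descr     varchar(100) NOT NULL,
    StartDate datetime NOT NULL,
    EndDate   datetime NULL
)

-- Type 4: current rows in one table, prior versions in a mirror table.
CREATE TABLE Part         (PartID int PRIMARY KEY, Descr varchar(100) NOT NULL)
CREATE TABLE Part_History (PartID int NOT NULL, Descr varchar(100) NOT NULL,
                           ChangedAt datetime NOT NULL,
                           PRIMARY KEY (PartID, ChangedAt))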
We have a database that adds over 100,000 records per day. This goes back to 2002, and I only need historical data for 6 months. Presently we can only delete 1000 rows at a time. Is there a faster way of deleting? We seem to continually run out of disk space. Urgent!
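One common approach is to loop over larger batches so the log can be managed between deletes; and when most of the table is going away, it is usually faster to copy the rows you want to keep into a new table and truncate the old one. A sketch of the batched loop (table and column names assumed):

SET ROWCOUNT 50000                             -- delete in 50,000-row batches
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.BigTable                   -- assumed table name
    WHERE RecordDate < DATEADD(month, -6, GETDATE())

    IF @@ROWCOUNT = 0 BREAK                    -- nothing older than 6 months remains
    -- In full recovery mode, back up the transaction log here so space is reused.
END
SET ROWCOUNT 0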
How do I build a TRIGGER that copies, every time, the data from the table on which the transaction occurs to the historical table? Thanks, Fernand
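A minimal sketch of such a trigger (table and column names invented): an AFTER trigger reads the inserted and deleted pseudo-tables and appends a copy of every affected row to the history table.

CREATE TRIGGER trg_Orders_History
ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON

    -- New and updated rows:
    INSERT INTO dbo.Orders_History (OrderID, Status, ChangedAt, Operation)
    SELECT OrderID, Status, GETDATE(), 'I/U'
    FROM inserted

    -- Pure deletes (rows in deleted with no counterpart in inserted):
    INSERT INTO dbo.Orders_History (OrderID, Status, ChangedAt, Operation)
    SELECT d.OrderID, d.Status, GETDATE(), 'D'
    FROM deleted d
    WHERE NOT EXISTS (SELECT 1 FROM inserted i WHERE i.OrderID = d.OrderID)
END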
I have about 45000 records in a CSV file, which I am using as HTTP request parameters to query a website and store some results in a database. This is the kind of application which runs 24/7, so database grows really quickly. Every insert fires up a trigger, which has to look for some old records based on some criteria and modify the last inserted record. My client is crazy about performance on this one and suggested to move the old records into another table, which has exactly the same structure, but would serve as a historical table only (used to generate reports, statistics, etc.), whilst the original table would store only the latest rows (so no more than 45k at a given time, whereas the historical table may grow to millions of records). Is this a good idea? Having the performance in mind and the fact that there's that trigger - it has to run as quickly as possible - I might second that idea. Is it good or bad? What do you think?
I read a similar post here which mentioned SQL Server 2005 partitioning. I might as well try this, although I have never used it before.
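For orientation, the 2005 partitioning setup looks roughly like this (boundary dates and all names invented). Once a month's partition is no longer current, it can be switched out to an archive table with ALTER TABLE ... SWITCH, which is a metadata operation rather than the row-by-row copy a trigger or INSERT/DELETE would pay for.

-- Split rows by month of their insert date.
CREATE PARTITION FUNCTION pfByMonth (datetime)
AS RANGE RIGHT FOR VALUES ('20080101', '20080201', '20080301')

CREATE PARTITION SCHEME psByMonth
AS PARTITION pfByMonth ALL TO ([PRIMARY])

CREATE TABLE dbo.Requests (
    RequestID  int IDENTITY NOT NULL,
    InsertedAt datetime     NOT NULL,
    ResultCode int          NULL
) ON psByMonth (InsertedAt)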
In a Data Flow, I need to use an SSIS variable of type "Object" inside a Script Component and assign to it the content of n variables of string type. On exiting from the script, the variable of type Object should contain something like the following:

AAAAAAAAAAAAAAAAAAAAAAAAAAAAA
BBBBBBBBBBBBBBBBBBBBBBBBBBBBB
CCCCCCCCCCCCCCCCCCCCCCCCCCCCC
DDDDDDDDDDDDDDDDDDDDDDDDDDDDD
...

On exiting from the data flow I will use the variable of type Object in a Script Task, reading each element in a cyclic fashion. Has anyone experienced something like this? Could anyone provide an example? Thanks in advance!
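One way that should work (a sketch; the variable name User::StringList and the column name are invented): fill an ArrayList inside the Script Component, hand it to the Object variable in PostExecute (ReadWriteVariables are only writable there, not per-row), then walk the list in the Script Task.

' --- Script Component (User::StringList listed in ReadWriteVariables) ---
Imports System.Collections

Public Class ScriptMain
    Inherits UserComponent

    Private list As New ArrayList()

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        list.Add(Row.SomeStringColumn)   ' assumed input column name
    End Sub

    Public Overrides Sub PostExecute()
        Me.Variables.StringList = list   ' the Object variable now holds all values
    End Sub
End Class

' --- Script Task downstream (User::StringList listed in ReadOnlyVariables) ---
Dim list As ArrayList = CType(Dts.Variables("User::StringList").Value, ArrayList)
For Each s As String In list
    ' process each element in turn
Next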
Hi all. I'm on a project which uses a lot of views for joining 2 or more tables. Using the MERGE JOIN component in SSIS would be a huge effort because it only has 2 inputs, and I have to SORT the inputs too. Isn't it possible to have a VIEW-like component that joins more than 2 tables and doesn't need sorting? (I've thought about creating views in the database engine, but that breaks my data flow in SSIS and isn't a practical solution.)
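When all the tables live in the same source database, the usual workaround is to push the join into the OLE DB Source's SQL command: one query can join any number of tables and needs no Sort transforms (illustrative names):

SELECT  o.OrderID, o.OrderDate, c.CustomerName, p.ProductName, d.Quantity
FROM    dbo.Orders       o
JOIN    dbo.OrderDetails d ON d.OrderID    = o.OrderID
JOIN    dbo.Products     p ON p.ProductID  = d.ProductID
JOIN    dbo.Customers    c ON c.CustomerID = o.CustomerID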
I am writing a custom data flow transformation component, and I need to get the name of the preceding component.
I have been trying to find a way to get a reference to the Package object, the MainPipe object, or the IDTSPath90 object (connecting to the IDTSInput90 of my component) from within my component, because I think from there I can get to the information I want.
I have a DWH where users query on a regular basis. I want to know what queries they ran, and which user fired each query, for the last one month or 6 months.
I have a SQL2005 db for tracking the prices of products at multiple retailers. The basic structure is, 'products' table lists individual products, 'retailer_products' table lists current prices of the products at multiple retailers, and 'price_history' table records when the price of a product changes at any retailer. The prices are checked from each retailer daily, but a row is added to the 'price_history' only when the price at the retailer changes.
I have the following query to retrieve the price history of a given product at multiple retailers:
SELECT price_history.datetimeofchange, retailer.name, price_history.price
FROM   product, retailer, retailer_product, price_history
WHERE  product.id = 'b486ed47-4de4-417d-b77b-89819bc728cd'
AND    retailer_product.retailerid = retailer.id
AND    retailer_product.associatedproductid = product.id
AND    price_history.retailer_productid = retailer_product.id
This gives the following results:
2008-03-08   Example Retailer   22.3
2008-03-28   Example Retailer   11.8
2008-03-30   Example Retailer   22.1
2008-04-01   Example Retailer   11.43
2008-04-03   Example Retailer   11.4
The question(s) I have are how can I:
1 - Get the price of a product at a given retailer at a given date/time. For example, get the price of the product at Retailer 2 on 03/28/2008. The table only contains data for Retailer 1 for this date; the behaviour I want is, when there is no data available, for the query to find the last date at which there was data from that retailer and use the price from that point - i.e. for this example the query should return 22.3 as the price, given that was the last recorded price change from that retailer (03/08/2008).
2 - Get the average price of a product at a given retailer at a given date/time In this case we would need to perform (1) across all retailers, then average the results
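For (1), a "latest change on or before the date" lookup does it; for (2), the same lookup per retailer via CROSS APPLY, then AVG. A sketch against the tables described (the retailer name literal is illustrative):

DECLARE @asof datetime
SET @asof = '20080328'

-- (1) Price at one retailer as of @asof:
SELECT TOP 1 ph.price
FROM   price_history ph
JOIN   retailer_product rp ON rp.id = ph.retailer_productid
JOIN   retailer r          ON r.id  = rp.retailerid
WHERE  rp.associatedproductid = 'b486ed47-4de4-417d-b77b-89819bc728cd'
AND    r.name = 'Retailer 2'
AND    ph.datetimeofchange <= @asof
ORDER BY ph.datetimeofchange DESC

-- (2) Average across all retailers as of @asof:
SELECT AVG(x.price) AS avg_price
FROM   retailer_product rp
CROSS APPLY (SELECT TOP 1 ph.price
             FROM   price_history ph
             WHERE  ph.retailer_productid = rp.id
             AND    ph.datetimeofchange <= @asof
             ORDER BY ph.datetimeofchange DESC) x
WHERE  rp.associatedproductid = 'b486ed47-4de4-417d-b77b-89819bc728cd'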
I've got a customer who wants reproducible/historical reporting. The problem is that the underlying data changes.
I tried to explain that this can't be done (can it?), but he doesn't understand.
To illustrate the situation: let's say a teacher wants to track spelling test scores for her students. Below are the scores for students A, B, and C (for January, February, March):
A: {70, 80, 85}
B: {70, 65, 80}
C: {100, 90, 100}
So, I can generate a historical report that charts the class average and student trend - that's pretty easy.
Now, in April, we find that the school board has mandated that the British spelling of words is OK, so the tests are re-graded and the cumulative scores (for January, February, March, April) change retroactively.
He wants a report showing the January average as (70+70+100)/3 = 80, when really it is (90+80+100)/3 = 90.
Now imagine that there are actually thousands of data points changing like this... Now also imagine that we add and remove students on a regular basis...
He and his office manager get frustrated when I explain that the reports are not simple - in their mind it is simple. They have determined the solution is to get a report writer and buy Crystal Reports. I've tried to explain that the problem is that the report specification is unclear (basically, they don't understand what they want). The situation is OK for now; I'm just trying to plan for when they figure out that buying Crystal Reports won't change their situation (except that they'll be out several thousand dollars).
I have a simple table with three fields: ID, LastName, FirstName. The ID is defined as the PK. In the table is a record of "12345, Smith, John". The incoming flat file has a record of "12345, Smith, Johnny".
In the SCD transform, the ID is the business key, and Last Name and First Name are defined as historical attributes.
During the load, the SCD transform correctly sends the data down the right path, but the insert fails with a primary key violation - as I would expect since it's trying to create a new current record.
How do I get around this problem without removing the PK?
I have a database which contains time series data (historical stock prices) which I have to search for patterns on a day-to-day basis. But searching this historical data for patterns is very time consuming, not only in writing the complex T-SQL scripts but also in executing them.
Table structure for one-minute data: [Date], [Time], [Open], [High], [Low], [Close], [Adjusted_Close], [MA], [DI], ...
Tick data: [Date], [Time], [Trade]

The most time-consuming queries are the ones with lots of inner joins. For example, if I have to compare the first few minutes' data, then I have to do an inner join like:

WITH IntervalData AS (
    SELECT [Date],
           SUM(CASE WHEN 1430 = [Time] THEN [PriceRange] END) AS '1430',
           SUM(CASE WHEN 1431 = [Time] THEN [PriceRange] END) AS '1431',
           SUM(CASE WHEN 1432 = [Time] THEN [PriceRange] END) AS '1432'
    FROM [INDU_1]
    GROUP BY [Date]
)
SELECT [Date], [1430], [1431], [1432], [1431] - [1430] AS 'Range'
FROM IntervalData
WHERE ([1430] > 0 AND [1431] < 0 AND [1432] < 0)
   OR ([1430] < 0 AND [1431] > 0 AND [1430] > 0)

------------------------------------------------------------------------

SELECT ind1.[Time], ind1.PriceRange, ind2.[Time], ind2.PriceRange
FROM INDU_1 ind1
INNER JOIN INDU_1 ind2
        ON  ind1.[Time] = ind2.[Time] - 1
        AND ind1.[Date] = ind2.[Date]
WHERE (ind1.[Time] = 2058)
  AND ((ind1.PriceRange > 0 AND ind2.PriceRange > 0)
    OR (ind2.PriceRange < 0 AND ind1.PriceRange < 0))
ORDER BY ind1.[Date] DESC;

Is there any way I can use SQL 2005 data mining models to make this searching faster?
No idea where this bug crept in from. Have been using SSIS for 1.5 years now without hitting this problem.
I had a script component opening an XML document and parsing it using XPATH. I added some code that uses StreamReader / Streamwriter (closing one stream before starting the other). The code works without issue in my C# app.
And it ran without issue 2-3 times in SSIS. Then suddenly after running my package again, the script component says it completes successfully, yet nothing happens. I set a breakpoint on the first line of code - it never hits it. I add a msgbox as the first line of code - and it never displays.
I then close my package and exit out of SSIS, and then re-open it. When I open my script component, all of my code is GONE. All references that I added are gone.
I tried adding the StreamReader/StreamWriter process to a DLL I created from my C# app, and added the DLL to the package -- same result.
I can reproduce this on 2 different computers.
Anyone experience this problem ? Any idea how to stop it ? Or debug it ?
Here is a slimmed down code sample of what causes the error :
Imports System.IO
Imports System.Xml

Public Class ScriptMain
    Public Sub Main()
        Try
            Dim xmlDoc As New XmlDocument
            xmlDoc.Load("c:\ulkasync_86281519_20070628045850225_4.xml")
            MsgBox("xmlLoaded")   ' this doesn't display once the package starts "acting up"
        Catch ex As Exception
            MsgBox(ex.Message)
            UpdateXML("c:\ulkasync_86281519_20070628045850225_4.xml", ex.Message)
        End Try
        Dts.TaskResult = Dts.Results.Success
    End Sub

    Private Sub UpdateXML(ByVal fileName As String, ByVal message As String)
        Try
            ' Pull the offending character code (e.g. "0x1A") out of the error message.
            Dim invalidChar As String = message.Trim().Substring(message.Trim().IndexOf("0x"), 4)

            Dim rd As StreamReader = New StreamReader(fileName)
            Dim xml As String = rd.ReadToEnd()
            xml = xml.Replace(invalidChar, String.Empty)
            xml = xml.Replace("", String.Empty)
            xml = xml.Replace("<![CDATA[<![CDATA[", "<![CDATA[")
            xml = xml.Replace("]]>]]>", "]]>")
            MsgBox("replaced")
            rd.Close()

            Dim wr As StreamWriter = New StreamWriter(fileName)
            wr.Write(xml)
            wr.Close()

            ' Re-load to confirm the document now parses.
            Dim xdoc As XmlDocument = New XmlDocument()
            xdoc.Load(fileName)
        Catch ex As Exception
            UpdateXML(fileName, ex.Message)
        End Try
    End Sub
End Class
I am working on a project, which involves displaying trends of certain aggregate values over time. For example, suppose we want to display how the number of active and inactive users changed over time.
One issue is how to store historical data. First of all, should I create a separate database for each historical snapshot or should I use one database for all snapshots? Second, our database size is a couple of gigabytes and replicating the entire database on a daily basis is not feasible. An alternative solution is to back up aggregate values, but how do I back up results of aggregate queries, where the user can specify a date range in the WHERE-clause? Another solution is to create fact tables from our relational schema and back those up.
Another issue is how to query historical data. Using multiple databases to store historical snapshots makes it harder to query.
As you can see there are several design alternatives and I would like to know how this sort of problem is generally solved in the industry. Does SQL Server provide any support for solving this problem?
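One industry-standard answer is a periodic snapshot fact table kept in the same database: a nightly job appends one row per aggregate of interest, stamped with the snapshot date, so any user-supplied date range becomes a simple range scan over the snapshots rather than a query against a restored backup. A sketch with invented names:

CREATE TABLE dbo.UserCountSnapshot (
    SnapshotDate  datetime NOT NULL PRIMARY KEY,
    ActiveUsers   int      NOT NULL,
    InactiveUsers int      NOT NULL
)

-- Nightly job:
INSERT INTO dbo.UserCountSnapshot (SnapshotDate, ActiveUsers, InactiveUsers)
SELECT CONVERT(char(8), GETDATE(), 112),
       SUM(CASE WHEN IsActive = 1 THEN 1 ELSE 0 END),
       SUM(CASE WHEN IsActive = 0 THEN 1 ELSE 0 END)
FROM   dbo.Users

-- Trend over any date range:
SELECT SnapshotDate, ActiveUsers, InactiveUsers
FROM   dbo.UserCountSnapshot
WHERE  SnapshotDate BETWEEN '20080101' AND '20080331'
ORDER BY SnapshotDate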
My application is to capture employee locations. Whenever an employee arrives at a location (whether arriving for work, or at one of the company's other sites) they scan the barcode on their employee badge. This writes a record to the tblTSCollected table (DDL and dummy data below).

The application needs to be able to display to staff in a control room the CURRENT location of each employee. From the data I've provided, this would be:

EMPLOYEE ID   LOCATION CODE
963           VB002
964           VB003
966           VB003
968           VB004
977           VB001
982           VB001

Note that, for example, Employee 963 had formerly been at VB001 but was more recently logged in at VB002, so the application is not concerned with the earlier record.

What would also be particularly useful would be the NUMBER of staff at each location - viz.

LOCATION CODE   NUM STAFF
VB001           2
VB002           1
VB003           2
VB004           1

Can anyone help? Many thanks in advance. Edward

NOTES ON DDL: The barcode is captured because the company may re-use barcode numbers (the barcode is derived from the employee PIN), so the barcode cannot be relied upon to be unique. The column fldRuleAppliedID is NULL because that particular row has not been processed; there are business rules concerning employee hours which operate on this data. Once a row has been processed for uploading to the payroll application, the fldRuleAppliedID column will contain a value. In the production system, therefore, any SQL as requested above will contain (fldRuleAppliedID IS NULL) in its WHERE clause.

if exists (select * from dbo.sysobjects
           where id = object_id(N'[dbo].[tblTSCollected]')
           and OBJECTPROPERTY(id, N'IsUserTable') = 1)
    drop table [dbo].[tblTSCollected]
GO

CREATE TABLE [dbo].[tblTSCollected] (
    [fldCollectedID]   [int] IDENTITY (1, 1) NOT NULL,
    [fldEmployeeID]    [int] NULL,
    [fldLocationCode]  [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [fldTimeStamp]     [datetime] NULL,
    [fldRuleAppliedID] [int] NULL,
    [fldBarCode]       [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY]
GO

INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (963, 'VB001', '2005-10-18 11:59:27.383', 45480)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (963, 'VB002', '2005-10-18 12:06:17.833', 45480)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (964, 'VB001', '2005-10-18 12:56:20.690', 45481)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (964, 'VB002', '2005-10-18 15:30:35.117', 45481)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (964, 'VB003', '2005-10-18 16:05:05.880', 45481)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (966, 'VB001', '2005-10-18 11:52:28.307', 97678)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (966, 'VB002', '2005-10-18 13:59:34.807', 97678)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (966, 'VB001', '2005-10-18 14:04:55.820', 97678)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (966, 'VB003', '2005-10-18 16:10:01.943', 97678)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (968, 'VB001', '2005-10-18 11:59:34.307', 98374)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (968, 'VB002', '2005-10-18 12:04:56.037', 98374)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (968, 'VB004', '2005-10-18 12:10:02.723', 98374)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (977, 'VB001', '2005-10-18 12:05:06.630', 96879)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (982, 'VB001', '2005-10-18 12:06:13.787', 96697)
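A sketch of both queries against that DDL: the latest scan per employee via a correlated MAX, then a count per location over the same "current" rows (add AND fldRuleAppliedID IS NULL for production, per the notes above):

-- Current location of each employee (latest scan wins):
SELECT t.fldEmployeeID, t.fldLocationCode
FROM   dbo.tblTSCollected t
WHERE  t.fldTimeStamp = (SELECT MAX(t2.fldTimeStamp)
                         FROM   dbo.tblTSCollected t2
                         WHERE  t2.fldEmployeeID = t.fldEmployeeID)
ORDER BY t.fldEmployeeID

-- Number of staff at each location, from the same current rows:
SELECT t.fldLocationCode, COUNT(*) AS NumStaff
FROM   dbo.tblTSCollected t
WHERE  t.fldTimeStamp = (SELECT MAX(t2.fldTimeStamp)
                         FROM   dbo.tblTSCollected t2
                         WHERE  t2.fldEmployeeID = t.fldEmployeeID)
GROUP BY t.fldLocationCode
ORDER BY t.fldLocationCode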
We have an issue in an SCD where a number of records may be presented that have changes to attributes of EVERY type.
Example:
BusinessKey: xxxxxxxx
BuildingTypeId: 7
BusinessUnitHistoryId: 4019
BusinessUnitId: 4019
CurrencyId: 26
DevelopmentTypeId: 14
MarketId: 182
Name: abcdefgh
CurrencyId is a fixed attribute; MarketId, BuildingTypeId, BusinessUnitId and BusinessUnitHistoryId are historical attributes; Name is a changing attribute.
The behaviour of the ETL seems to suggest that if fixed-attribute changes are detected, those rows will error, and therefore the changing and historical attributes will NOT be amended during the SCD transformation. Is this correct? It seems to be what is happening.
Is it possible to design a cube with a dimension "DimEmployee" (SCD Type 2) and query the following: what "title" did the person "xyz" have at time "2015-01-01"? See the example data in the images.
I also have a DimTime with day granularity. The DimEmployee is not at day granularity.
I'm trying to create a website based on interesting historical facts, searchable by date. I need each fact entry to have a datetime field in the database, but unfortunately this type only lets me go back as far as the year 1753 (for an inexplicable reason, I can go up to the year 9999). Is there something I'm overlooking here, or did Microsoft goof this one? Will I have to store my dates as 3 separate custom month/day/year fields? If anybody has a solution for this, please let me know (a B.C./A.D.-enabled solution isn't at all needed, but I'm open to that too).
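If it comes to that, the separate-fields fallback is workable; a sketch (invented names), with a signed smallint year so B.C. dates could fit too:

CREATE TABLE dbo.HistoricalFact (
    FactID    int IDENTITY PRIMARY KEY,
    FactYear  smallint NOT NULL,        -- negative values could represent B.C.
    FactMonth tinyint  NULL,
    FactDay   tinyint  NULL,
    FactText  varchar(500) NOT NULL
)

-- A composite index keeps date searches and chronological sorts cheap:
CREATE INDEX IX_HistoricalFact_Date ON dbo.HistoricalFact (FactYear, FactMonth, FactDay)

SELECT FactText
FROM   dbo.HistoricalFact
WHERE  FactYear = 1066 AND FactMonth = 10 AND FactDay = 14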
I have a historical table on a dedicated SQL Server (let's call it the reporting DB) that is populated every morning with production data that does not already exist. The data in the prod table is purged after 7 days, and nothing is ever deleted from the historical table. I have set up the linked server between the two 2008 SQL Servers, but when I run this simple query from the reporting DB, it takes more than 5 minutes and still shows "executing". I eventually have to cancel it:
-- INSERT INTO Temp_Import_historical
SELECT TOP 1 *
FROM   [192.168.1.100].ProdDB.dbo.Temp_Import_historical a
WHERE  NOT EXISTS (SELECT [Temp_Import_ID]
                   FROM   Temp_Import_historical
                   WHERE  a.[Temp_Import_ID] = Temp_Import_historical.[Temp_Import_ID])
I have omitted the INSERT statement on purpose, since I can't even get it to output 1 row. Why is this such a resource-intensive query?
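A likely culprit is that the NOT EXISTS makes the local server probe the linked server row by row. A common workaround (sketch) is to pull the remote rows across in one set-based pass into a local staging table, then do the existence check entirely locally:

-- 1. One set-based pull of the remote rows (or just their keys):
SELECT *
INTO   #staging
FROM   [192.168.1.100].ProdDB.dbo.Temp_Import_historical

-- 2. Local existence check, then insert only the new rows:
INSERT INTO Temp_Import_historical
SELECT s.*
FROM   #staging s
WHERE  NOT EXISTS (SELECT 1
                   FROM   Temp_Import_historical h
                   WHERE  h.[Temp_Import_ID] = s.[Temp_Import_ID])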
I want to load historical data from an old system into a new one. Thing is, the old system stored dates as datetime and the new one uses datetimeoffset.
All data was collected in the same time zone... but with Daylight Saving Time (DST) applied.
The offset is either +04:00 or +05:00, based on the calendar date. To add to the complexity, the rules for DST changed a couple of years ago.
To determine the offset, I'd need to know what the server time zone was, or would have been, for each historical date.
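One way to handle it (a sketch; all names, the boundary dates, and the direction of the shift are assumptions to verify): encode each year's DST window, including the years under the old rules, in a small lookup table, then apply TODATETIMEOFFSET with whichever offset the window implies.

-- One row per DST window; the dates below are purely illustrative.
CREATE TABLE dbo.DstWindow (
    DstStart datetime NOT NULL,
    DstEnd   datetime NOT NULL
)
INSERT INTO dbo.DstWindow VALUES ('2006-03-26 02:00', '2006-10-29 03:00')
INSERT INTO dbo.DstWindow VALUES ('2007-03-25 02:00', '2007-10-28 03:00')

-- +05:00 inside a DST window, +04:00 otherwise:
SELECT TODATETIMEOFFSET(t.EventDate,
           CASE WHEN EXISTS (SELECT 1 FROM dbo.DstWindow w
                             WHERE t.EventDate >= w.DstStart
                               AND t.EventDate <  w.DstEnd)
                THEN '+05:00' ELSE '+04:00' END) AS EventDateOffset
FROM   dbo.OldSystemData t   -- assumed source table/column names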
Recently, I partitioned one of my largest tables into multiple monthly filegroups. For the current month, data is attached to my "Active" table. The older records are kept in the "Historical" table. I need an efficient way to pull records when I have a date range that can be spread across both tables.
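A common pattern is a UNION ALL view over the two tables. With a CHECK constraint on the date column of each table, the optimizer can skip whichever table cannot contain rows for the requested range (names invented):

CREATE VIEW dbo.AllRecords
AS
SELECT * FROM dbo.ActiveRecords      -- has CHECK (RecordDate >= '20080601')
UNION ALL
SELECT * FROM dbo.HistoricalRecords  -- has CHECK (RecordDate <  '20080601')
GO

SELECT *
FROM   dbo.AllRecords
WHERE  RecordDate BETWEEN '20080515' AND '20080615'  -- spans both tables transparently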