History/Data Change File Approach
Mar 4, 2005
I need to record in a table:
Who, When, What Field and New Value of Fields
When changes occur to an existing record.
The purpose is for users to occasionally view the changes. They'll want to be able to see the history of the record - who changed what and when.
I figured I'd add the needed code to the stored procedure that's doing the update for the record.
When the stored procedure is called to do the update, the PK and parameters are sent.
The SP could first retrieve the current state of the record from disk,
then do the update, then "spin" through the fields comparing the record state before and after the update. Differences could be concatenated into a "Changes" string and, in the end, this string is saved in a history record along with a few other fields:
Name, DateTime, Changes
FK to Changed Record: some int value
Name: Joe Blow
Date: 1/1/05 12:02pm
Changes: Severity: 23 Project: Everest Assigned Lab: 204
How does the above approach sound?
Is there a better way you'd suggest?
Any sample code for a system that spins through the fields comparing one temporary record with another looking for changes?
Thanks,
Peter
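A rough sketch of that compare-and-log pattern inside the update procedure, with hypothetical table, column and parameter names (Issue, IssueHistory, @Severity and so on) standing in for the real ones; it compares the stored values to the incoming parameters before the update and logs one "Changes" string when something differs:

CREATE PROCEDURE dbo.usp_UpdateIssue
    @IssueId int, @UserName varchar(50),
    @Severity int, @Project varchar(50), @AssignedLab int
AS
BEGIN
    DECLARE @changes varchar(4000);

    -- build the "Changes" string from the before-image vs. the new parameter values
    -- (wrap columns/parameters in ISNULL if NULLs are possible)
    SELECT @changes =
          CASE WHEN i.Severity    <> @Severity    THEN ' Severity: '     + CAST(@Severity AS varchar(10))    ELSE '' END
        + CASE WHEN i.Project     <> @Project     THEN ' Project: '      + @Project                          ELSE '' END
        + CASE WHEN i.AssignedLab <> @AssignedLab THEN ' Assigned Lab: ' + CAST(@AssignedLab AS varchar(10)) ELSE '' END
    FROM dbo.Issue AS i
    WHERE i.IssueId = @IssueId;

    UPDATE dbo.Issue
    SET Severity = @Severity, Project = @Project, AssignedLab = @AssignedLab
    WHERE IssueId = @IssueId;

    IF @changes <> ''
        INSERT INTO dbo.IssueHistory (IssueId, Name, ChangeDate, Changes)
        VALUES (@IssueId, @UserName, GETDATE(), @changes);
END

A truly generic loop over arbitrary columns would need dynamic SQL over INFORMATION_SCHEMA.COLUMNS, which is why this is often moved into an UPDATE trigger instead.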
View 3 Replies
Jun 17, 2004
This one is giving me quite a bit more difficulty than I ever imagined it would...
Essentially I would like to use one table to store the change history for multiple tables. I would like to use an update trigger to check which fields have changed in each record, and write a single record for each field that changed containing the table name, field name, previous value and new value to a history table.
I can't seem to find a good way to do this.
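For what it's worth, a sketch of one common shape for this (table and column names hypothetical, SQL 2000-friendly): an UPDATE trigger per audited table, each writing one row per changed column into the shared history table. Generating the trigger text from INFORMATION_SCHEMA.COLUMNS is the usual way to avoid hand-writing one SELECT per column.

CREATE TRIGGER trg_Customer_Audit ON dbo.Customer
FOR UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- one SELECT per audited column; inserted/deleted are joined on the primary key
    INSERT INTO dbo.ChangeHistory (TableName, FieldName, KeyValue, OldValue, NewValue, ChangedAt)
    SELECT 'Customer', 'Name', i.CustomerId, d.Name, i.Name, GETDATE()
    FROM inserted i JOIN deleted d ON d.CustomerId = i.CustomerId
    WHERE ISNULL(d.Name, '') <> ISNULL(i.Name, '')
    UNION ALL
    SELECT 'Customer', 'City', i.CustomerId, d.City, i.City, GETDATE()
    FROM inserted i JOIN deleted d ON d.CustomerId = i.CustomerId
    WHERE ISNULL(d.City, '') <> ISNULL(i.City, '');
END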
View 9 Replies
View Related
Feb 1, 2008
There is a report you get when you click the server name, choose Reports, and run Schema Change History.
My SQL Server 2005 instance has been running for a few weeks and this report lists everything since the day it started. Is there a way to recycle this and clean it up on a weekly basis?
View 5 Replies
View Related
Oct 4, 2007
Hello.
Is it possible to find out a complete history of when the passwords for any SQL Server logins were changed, and by what/whom, in 2005 Standard Edition?
Thanks.
View 1 Replies
View Related
Jun 5, 2015
One of my SQL developers observed that the size of the parameter 'Parameter_XYZ' in a certain stored procedure had been changed from 25 to 255 during some production fixes; however, it now looks like someone has changed it back to 25 instead of 255.
DECLARE @Parameter_XYZ varchar(25);
Can we figure out in which sprint/drop the stored procedure was changed and Parameter_XYZ set back to 25? Can any log recovery mechanism get such details?
Can we get the stored procedure text between the different alterations?
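One place worth checking, sketched below with a hypothetical procedure name: the default trace records Object:Altered events (event class 164) with the login and time. It only keeps a few rollover files, so older alterations will have aged out, and it does not keep the old procedure text - for that you would need your own DDL trigger or source control.

DECLARE @path nvarchar(260);
SELECT @path = path FROM sys.traces WHERE is_default = 1;

SELECT t.StartTime, t.DatabaseName, t.ObjectName, t.LoginName, t.HostName, t.ApplicationName
FROM sys.fn_trace_gettable(@path, DEFAULT) AS t
WHERE t.EventClass = 164                  -- Object:Altered
  AND t.ObjectName = 'usp_MyProcedure';   -- hypothetical procedure name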
View 4 Replies
View Related
Apr 20, 2001
Hi,
I am not sure how to change the data and log file names on an existing database.
If it were Oracle, I would just mount the database and use an ALTER DATABASE statement
to change the data file name or location.
How do I do it in SQL 7?
Thanks
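A rough sketch of the usual route on SQL Server 7/2000, with hypothetical database and file names: detach the database, rename or move the physical files in the file system, then re-attach them under the new names.

EXEC sp_detach_db 'MyDB';
-- rename/move MyDB.mdf and MyDB_log.ldf at the operating-system level here
EXEC sp_attach_db 'MyDB',
     'C:\MSSQL7\Data\MyDB_new.mdf',
     'C:\MSSQL7\Data\MyDB_new_log.ldf';
-- (On SQL Server 2005 and later, ALTER DATABASE ... MODIFY FILE (NAME = ..., FILENAME = ...)
--  can repoint a file, and NEWNAME can rename the logical file, without detaching.)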
View 1 Replies
View Related
Jun 3, 2004
Hi All,
Can anyone help me change an existing data file to a new name?
Thanks.
View 4 Replies
View Related
Jun 30, 2015
How can I easily identify who dropped a table?
View 8 Replies
View Related
Nov 24, 2000
Hello
I have a package that exports data to a text file;
this package is scheduled to run every night.
How can I change the name of the file dynamically?
Or how can the new data be appended at the end of the file?
Thanks
Cyril Caillaud
View 1 Replies
View Related
Mar 2, 2007
Hi guys,
Suppose I have a flat file whose columns go like this:
rackId, cdname1, artist1, year1, cdname2, artist2, year2, cdname3, year3, artist3,...cdnameN, yearN, artistN
And I want to map it to a table that has the following columns:
rack_id | CDName | Year | Artist
Assuming that N is constant and all rows of the flat file will have N or fewer values filled in per rack, how do I map the fields from the flat file so that cdname1, cdname2 and cdname3 all fall under the CDName column, year1 and so on fall under the Year column, and artist1 and so on fall under the Artist column?
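One way, sketched below with hypothetical staging and destination table names and N = 3: load the file as-is into a wide staging table, then unpivot each repeating group with one SELECT per group. SSIS also has an Unpivot transformation that does the same thing inside the data flow.

INSERT INTO dbo.RackCD (rack_id, CDName, [Year], Artist)
SELECT rackId, cdname1, year1, artist1 FROM dbo.RackStaging WHERE cdname1 IS NOT NULL
UNION ALL
SELECT rackId, cdname2, year2, artist2 FROM dbo.RackStaging WHERE cdname2 IS NOT NULL
UNION ALL
SELECT rackId, cdname3, year3, artist3 FROM dbo.RackStaging WHERE cdname3 IS NOT NULL;
-- ...repeat up to group N; unused slots are skipped by the IS NOT NULL filter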
View 3 Replies
View Related
Mar 8, 2006
I'm getting a bit lost in SSIS. I've got an Excel source file that I'm trying to load into a table. I keep getting validation errors that warn about not being able to convert between unicode and non-unicode string data types.
I'm trying to figure out where I have to change this and am frankly confused. It seems SSIS is selecting various columns as unicode/WSTR data types, but I want them to import as regular string types.
On the Data Flow tab in SSIS, I right-click on the source Data Flow component (the Excel file) and select Show Advanced Editor. Then on the last tab, Input and Output Properties, there's a tree view for the Excel output. There are "External Columns" and "Output Columns" containers in the tree view.
I tried setting some of these but they don't seem to "take". Do I need to change the data type for each column under both the External and Output columns?
That seems like a lot of work! And, as I say, I tried setting some, but I still got the same validation errors. So, then I go back to this spot (Advanced Editor -> Input and Output Properties tab) and my changes seem to have been lost.
Any help would be appreciated!
View 5 Replies
View Related
Nov 9, 2015
I added a secondary data file to tempdb yesterday and gave it the wrong location by mistake. If I try to change the location, I now get an error. I think that is because tempdb is in use, and that is why I can't change its secondary file's location. Do I need to take tempdb offline and then change the secondary file's location?
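A minimal sketch, with a hypothetical logical file name and path: for tempdb you can repoint the file with ALTER DATABASE without taking anything offline; the change only takes effect at the next service restart, because tempdb is rebuilt at startup, after which the misplaced file can be deleted.

ALTER DATABASE tempdb
MODIFY FILE (NAME = 'tempdev2', FILENAME = 'D:\SQLData\tempdev2.ndf');
-- restart the SQL Server service, then remove the file left at the wrong location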
View 3 Replies
View Related
May 3, 2007
hi, guys
How do I move one table or a ResultSet to a separate database .sdf file in Compact Edition?
For example, I have a handheld with Compact Edition installed to do some order work. After the job is done, I want to move the new Orders table to our server or some other place. Of course I don't want to move everything in the .sdf file. So what is the best way to move the Orders table out to a separate .sdf file? (In my case, I cannot use RDA or replication.)
Thanks.
View 5 Replies
View Related
Jun 16, 2015
Using SSRS 2008R2, is it possible to change the file extension of a CSV data-driven subscription? I'm outputting a text file with a .csv extension, but users have access to these files and I don't want them opened with Excel and then saved back in an incorrect format.
These are the options I have from the report "manage" option.
View 2 Replies
View Related
Sep 12, 2007
Hi,
I want to encrypt certain data like passwords, SSNs, credit card info, etc. before saving it in the database. Also, this encrypted data should be queryable using standard SQL statements like:
select * from users where userid=454 and password = 'encrypted data'
The mechanism to encrypt data could be in a .NET application. The code that does the encryption/decryption should also be protected so that it doesn't work if it falls into the wrong hands.
Can anyone suggest what would be the best way to accomplish above?
thanks,
dapi
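A sketch of one common pattern for the password part (table and column names hypothetical, SQL Server 2005 or later): store only a salted hash and compare hashes in the WHERE clause, so the plain text never sits in the table. For SSN/credit-card data that must be read back, the built-in EncryptByPassPhrase/DecryptByPassPhrase or certificate/key-based EncryptByKey functions are the usual route instead.

DECLARE @password nvarchar(128), @salt uniqueidentifier;
SET @password = N'secret';
SET @salt = NEWID();

INSERT INTO dbo.Users (UserId, PasswordSalt, PasswordHash)
VALUES (454, @salt,
        HASHBYTES('SHA1', CAST(@salt AS varbinary(16)) + CAST(@password AS varbinary(256))));

-- login check: hash the supplied password with the stored salt and compare
SELECT *
FROM dbo.Users AS u
WHERE u.UserId = 454
  AND u.PasswordHash = HASHBYTES('SHA1',
        CAST(u.PasswordSalt AS varbinary(16)) + CAST(@password AS varbinary(256)));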
View 3 Replies
View Related
Feb 23, 2006
Hi!
I wonder what would be the best way (and, to be honest, how to do it at all) to perform data normalization with SSIS. The scenario is as follows:
I've got a plain table with several columns in it. Some columns can be copied straight into the destination table. Some columns (strings) should be looked up in another table to get an ID: on success, just replace the string with the ID; on failure, create a new record in the lookup table and return the newly created ID.
Thanks for any ideas and maybe short samples
Anrijs Vitolins
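In T-SQL terms, a short sketch of the lookup-or-insert step for one string column (table and column names hypothetical). In SSIS the same idea is usually a Lookup transformation whose no-match output inserts the new lookup rows and is merged back with a Union All before the destination.

-- add any strings that are not in the lookup table yet
INSERT INTO dbo.LookupTable (Name)
SELECT DISTINCT s.SomeText
FROM dbo.SourceTable AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.LookupTable AS l WHERE l.Name = s.SomeText);

-- now every row resolves to an ID, so the destination insert is a plain join
INSERT INTO dbo.DestinationTable (CopiedCol, SomeTextId)
SELECT s.CopiedCol, l.Id
FROM dbo.SourceTable AS s
JOIN dbo.LookupTable AS l ON l.Name = s.SomeText;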
View 1 Replies
View Related
Nov 26, 2007
I have staged my tables in a database which is on the same server as the destination database, and they are on SQL Server 2005.
Now I need to push the data from the staged tables to the destination.
Which is the best approach in SSIS?
1) using an Execute SQL Task to call a stored procedure that pushes the data to the other database using a database.dbo.table name from the stored procedure.
or
2) using dataflows to call a stored procedure and map source and destination.
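For what option 1 amounts to, a tiny sketch with hypothetical database and table names: because both databases are on the same instance, the stored procedure can push the rows set-based with a three-part name, which is usually the fastest route when no transformation is needed.

INSERT INTO DestDB.dbo.Customer (CustomerId, CustomerName)
SELECT CustomerId, CustomerName
FROM StageDB.dbo.Customer;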
View 13 Replies
View Related
Sep 27, 2007
Hi, I am loading data into tables with the constraints on and redirecting the error rows into a separate table. Is there a way to capture the error rows from an Execute SQL Task by loading the data directly without constraints and later adding the constraints back with an Execute SQL Task, redirecting the violating rows to the error table? That approach would make the loads quicker; the approach I am using now works on a row-by-row basis. And if I drop the constraints, load the data, and then add the constraints back, will this deposit the same error rows as the current approach? Please send me your suggestions.
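A rough sketch of the set-based variant, with hypothetical table and constraint names: load with the foreign key disabled, sweep the violating rows into an error table, then re-enable the constraint WITH CHECK so it ends up trusted. It catches the same rows the row-by-row redirect would, just after the fact.

ALTER TABLE dbo.OrderDetail NOCHECK CONSTRAINT FK_OrderDetail_Order;

-- ...bulk load dbo.OrderDetail here...

-- move the orphans to an error table with the same column layout
DELETE d
OUTPUT deleted.* INTO dbo.OrderDetail_Errors
FROM dbo.OrderDetail AS d
WHERE NOT EXISTS (SELECT 1 FROM dbo.[Order] AS o WHERE o.OrderId = d.OrderId);

ALTER TABLE dbo.OrderDetail WITH CHECK CHECK CONSTRAINT FK_OrderDetail_Order;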
View 3 Replies
View Related
Dec 3, 2006
Hi all,
In building an ETL tool, we are in a situation where a table has to be loaded on an incremental basis. In the first run all the records, approximately 100 lakhs (10 million), have to be loaded. From the next run, only the records that were updated since the last run of the package, or newly added, are to be pulled from the source database. One idea we had was to have two OLE DB Source components: in one, get the records that were updated or newly added (since we have update columns in the DB, getting them is fairly simple); in the other OLE DB Source, load all the records from the destination, pass them to a Merge Join, then have a Conditional Split down the pipeline and handle the updates and inserts.
Now the question is, how slow is this going to be? Will there be a case where the source DB returns records quickly but the Merge Join stalls waiting for all the records from the destination?
What might be the ideal way to go about this scenario? Please advise.
Thanks in advance.
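A sketch of the usual alternative to joining the whole destination back in, with hypothetical table and column names: drive the OLE DB source query off the last successful load time, so only changed or new rows ever enter the pipeline, and use a Lookup against the destination keys (or a staging table plus an UPDATE) to split updates from inserts.

DECLARE @LastRun datetime;
SELECT @LastRun = MAX(LoadStartTime)
FROM dbo.EtlRunLog
WHERE Succeeded = 1;

SELECT *
FROM dbo.SourceTable
WHERE UpdatedDate > @LastRun
   OR CreatedDate > @LastRun;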
View 13 Replies
View Related
Sep 18, 2007
Hi ,
My input is a flat file source and it has spaces in a few columns of the data. These columns are linked to another table as a foreign key, and when I try loading them into a relational structure a foreign key violation occurs. Is there a standard method to replace these spaces?
What approach should I take so that the data gets loaded into the relational structure?
for example
Name Age Salary Address
dsds 23 fghghgh
Salary description level
2345 nnncncn 4
Here Salary is used in this example; the datatype is char in the real scenario.
What approach should I take to load the data while cleansing the spaces in SSIS?
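A small sketch of the cleansing step, with a hypothetical staging table: turn the space-only values into NULL before loading the relational tables, so they no longer trip the foreign key. Inside SSIS the same thing can be a Derived Column expression that replaces the column with NULL when the trimmed value is empty.

UPDATE dbo.StagingEmployee
SET Salary = NULL
WHERE LTRIM(RTRIM(Salary)) = '';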
View 4 Replies
View Related
Jun 26, 2007
I would like to know if there is a way to maintain the history of changes to the reports that have been published to the report server.
I know that the report definitions get saved in the ReportServer database. But let's say a user makes a change to the published report and then saves it back to the server, and that latest change was incorrect, so I have to revert to the previous version of the published report. Is there a way to do that? Does the report server maintain a history of previous versions?
There is a history for each report, and I think that corresponds to the history of report executions (output data). But I am talking about the history of the actual report definition.
Thanks for your help.
View 2 Replies
View Related
Feb 1, 2007
I had set up merge replication across the servers, but each morning I find that the replication synchronization has stopped with the following error:
The merge process was unable to access row metadata at the 'Subscriber'. When troubleshooting, restart the synchronization with verbose history logging and specify an output file to write to, or use SQL Profiler to determine the source of the failure. (Source: MSSQL_REPL, Error number: MSSQL_REPL-2147200996)
To fix this I manually restart the synchronization agent by stopping and then starting it.
Please tell me how I can fix this message, or tell me how I can start/stop the merge agent from the command line, so that I can script starting and stopping the agent as a job and configure it to run every morning.
regards
Ahmad Drshen
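For the start/stop part, a hedged sketch: each merge agent normally runs as a SQL Server Agent job, so it can be bounced from T-SQL (or from osql/sqlcmd inside a scheduled script). The job name below is hypothetical; take the real one from the Jobs list under SQL Server Agent.

EXEC msdb.dbo.sp_stop_job  @job_name = N'PUBSRV-PubDB-MyPublication-SUBSRV-1';  -- errors if the job is not currently running
EXEC msdb.dbo.sp_start_job @job_name = N'PUBSRV-PubDB-MyPublication-SUBSRV-1';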
View 1 Replies
View Related
Jan 19, 2007
It's a well-known issue that one can't use any dataset fields in a report header/footer directly. One of the approaches is to create a query-based parameter that basically equals =First(Fields!@FieldName@.Value, "@DataSetName@") and use that parameter value instead. But it doesn't work in my case!
My report displays some entity description and is parameterized with an EntityID param. Its header contains the entity name which, following that approach, is queried from the data source through the EntityName report parameter. There's an important detail: the report is displayed in a ReportViewer control that is embedded into my application, and the entity ID parameter isn't set by the user in the ReportViewer parameters area. Its default value is changed by the application with the SetReportParameters() web method every time a user wants to view the report for the entity the user is exploring in the application. But after the report has been rendered, its header always contains the outdated (not the actual) entity name. Nevertheless, the report body contains the actual data (including the entity name). If I alter the entity ID parameter in ReportViewer or in the web-based Report Manager and refresh the report, the header displays the correct entity name.
What's wrong in the workflow described?
View 3 Replies
View Related
Mar 25, 2015
How do I get the products on which the SalesPerson changed? Here is the table with data.
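Since the sample data didn't come through here, a hedged sketch with hypothetical table and column names (ProductId, SalesPerson, SaleDate), assuming SQL Server 2012 or later for LAG: compare each row's salesperson with the previous row for the same product and keep the products where it differs.

SELECT DISTINCT ProductId
FROM (
    SELECT ProductId, SalesPerson,
           LAG(SalesPerson) OVER (PARTITION BY ProductId ORDER BY SaleDate) AS PrevSalesPerson
    FROM dbo.ProductSales
) AS x
WHERE PrevSalesPerson IS NOT NULL
  AND PrevSalesPerson <> SalesPerson;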
View 9 Replies
View Related
Oct 3, 2007
We are using a SQL Server 2000 Replication.
I'm using the Merge Agent History screen to retrieve information about
replication sessions. Is there any other screen to know exactly which data was replicated in each session? Or at least to know the script generated in each session?
Thanks
View 5 Replies
View Related
Apr 9, 2008
Hi
I have a table called my_history that has columns like this
column_name , old_value, new_value, key, date
bankaccount 30 50 1 01-Apr-2008
bankbalance 10 14 2 04-Apr-2008
and so on............
The history table is populated using triggers
The main table called my has a structure like
bankaccount bankbalance bankname name address key
50 50 xyz abc ford 1
30 14 abc xyz east 2
Now, using this information, can I reconstruct the records in the my table as they were before the updates happened?
I am finding it very difficult to do this. Is there a way to do it in T-SQL?
The problem is the my_history table, where column_name keeps on varying.
regards
Hrishy
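A rough sketch of the reconstruction idea for one key, using the column names from the post (bracketed, since key and date are awkward identifiers): for every column that changed after the point in time you care about, the old_value of its earliest change is the value the row had back then; columns that never appear keep their current value from the my table. Turning the resulting name/value pairs back into one wide row needs a pivot, and a dynamic one at that since column_name varies, which is what makes this design painful to query.

DECLARE @key int, @asof datetime;
SET @key  = 1;
SET @asof = '20080101';

SELECT h.column_name, h.old_value AS value_as_of
FROM my_history AS h
WHERE h.[key] = @key
  AND h.[date] > @asof
  AND h.[date] = (SELECT MIN(h2.[date])        -- earliest change after the as-of point
                  FROM my_history AS h2
                  WHERE h2.[key] = @key
                    AND h2.column_name = h.column_name
                    AND h2.[date] > @asof);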
View 7 Replies
View Related
Jun 27, 2014
I have a history table of Employee data.
id | EmpNo | EmpName | MobileNo | Email | EmpSSS | UpdateDate | UpdateUser
I have to write a stored procedure that will show the history and changes made for a given EmpNo, with the UpdateDate, UpdateUser, and an indication of which field was modified. Ex: the employee's mobile number was changed from '134151235' to '23523657'.
Result must be:
EmpNo | UpdateDate | UpdateUser | Field changed | Change from | change to
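A hedged sketch of that comparison, assuming SQL Server 2008 or later and a hypothetical table name dbo.EmployeeHistory: number the history rows per employee, join each row to the one before it, unpivot the columns of interest, and keep only the pairs that differ.

DECLARE @EmpNo int;
SET @EmpNo = 1001;   -- hypothetical employee

;WITH h AS (
    SELECT *, ROW_NUMBER() OVER (PARTITION BY EmpNo ORDER BY UpdateDate) AS rn
    FROM dbo.EmployeeHistory
)
SELECT cur.EmpNo, cur.UpdateDate, cur.UpdateUser,
       v.FieldChanged, v.ChangeFrom, v.ChangeTo
FROM h AS cur
JOIN h AS prev
  ON prev.EmpNo = cur.EmpNo AND prev.rn = cur.rn - 1
CROSS APPLY (VALUES
    ('EmpName',  prev.EmpName,  cur.EmpName),
    ('MobileNo', prev.MobileNo, cur.MobileNo),
    ('Email',    prev.Email,    cur.Email)
) AS v(FieldChanged, ChangeFrom, ChangeTo)
WHERE cur.EmpNo = @EmpNo
  AND ISNULL(v.ChangeFrom, '') <> ISNULL(v.ChangeTo, '');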
View 4 Replies
View Related
May 27, 2008
Hi,
I have a package A which is copied from another existing package B, as most of the data structures and ETL mappings are the same.
What I need to change in package A is the file in the Flat File Connection Manager. I can change it in the connection manager editor; however, it is automatically changed back to the previous one every time I try to Save All or run the package.
I also tried to copy/paste this Flat File Connection Manager within the same package, but the same thing happens. The package file is not set to 'Read Only', so everything else saves fine except for the file name in the connection manager.
Is this a bug? It would be much appreciated if anyone could give me an idea about this.
Thanks,
Jenny
View 7 Replies
View Related
Mar 28, 2006
Hi,
We have a requirement to run the package on a daily basis.
The package should run at a specific time on that day.
We are using the Windows scheduler for that.
We will have one new flat file every day.
Is there any way to attach this file to the flat file source dynamically?
The requirement is:
the flat file source should be able to read the new flat file every day.
We have no option to change it manually; the flat file source has to pick up the file automatically at that time,
so that it can take that flat file and load it into the database table.
View 1 Replies
View Related
May 12, 2015
We have a need to report on historical data when none exists in the database.
Create a tabular report rdl that is run on a regular schedule. This report run is saved as an Excel sheet that overwrites the previous run, which has the same name.
The report is designed to use two data sources: the database and the previously run Excel sheet. The data is "merged" using the Lookup function, thereby creating a new report that includes the history needed.
View 2 Replies
View Related
May 2, 2006
Hi,
How can I dynamically change the file name in the File Connection Manager in an SSIS package?
I could do this in SQL 2000, but how do I achieve the same thing in SQL 2005? I have to generate 10 different Excel files and just need to change the file name in the connection manager for the Excel file.
Thanks
Shafiq
View 4 Replies
View Related
Feb 9, 2007
Hi,
My scenario:
I have a master securities table which has 7 fields. As part of the daily process I am uploading flat files into database tables. The flat files contain the master (static) security data as well as the analytics (transaction) data. I need to
1) separate the master (static) data from the flat files,
2) check whether that data is present in the master table, if not then insert that data into the master table
3) if the data is present, then move the existing record to a history table and then update the main master table.
All the 7 fields need to be checked to uniquely identify a single record in the master table.
How can this be done? Can we use a combination of data flow items, or should we write a SQL procedure to do all this?
Thanks in advance for your help.
Regards,
$wapnil
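A rough T-SQL sketch of steps 2 and 3, with hypothetical table and column names and the 7-field key abbreviated to two columns; in SSIS the same split is typically a Lookup on the master table, with the match output feeding the history-insert/update path and the no-match output feeding the insert path.

-- step 3a: copy the existing version of matched records to history
INSERT INTO dbo.SecurityMasterHistory (Cusip, Exchange, IssuerName, Currency)
SELECT m.Cusip, m.Exchange, m.IssuerName, m.Currency
FROM dbo.SecurityMaster AS m
JOIN dbo.StagingSecurity AS s
  ON s.Cusip = m.Cusip AND s.Exchange = m.Exchange;       -- ...all 7 key fields in reality

-- step 3b: update the master from the staged data
UPDATE m
SET m.IssuerName = s.IssuerName, m.Currency = s.Currency
FROM dbo.SecurityMaster AS m
JOIN dbo.StagingSecurity AS s
  ON s.Cusip = m.Cusip AND s.Exchange = m.Exchange;

-- step 2: insert staged records that are not in the master yet
INSERT INTO dbo.SecurityMaster (Cusip, Exchange, IssuerName, Currency)
SELECT s.Cusip, s.Exchange, s.IssuerName, s.Currency
FROM dbo.StagingSecurity AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.SecurityMaster AS m
                  WHERE m.Cusip = s.Cusip AND m.Exchange = s.Exchange);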
View 4 Replies
View Related
Jul 22, 2003
I want to automate my performance monitoring reports (SQL 2000), so I need to import the Performance Monitor output log file (.CSV) into a table. I feel converting this .CSV to .XLS is a good way of importing the log through DTS. Is there any command/script to change the file type from .CSV to .XLS, so that I can include that step in the DTS package?
View 6 Replies
View Related