Suggestions On Maintaining Audit Fields
Apr 14, 2008
Currently all of our tables in several databases have the following columns:
user_added (this is nvarchar)
host_added (this is nvarchar)
date_added (this is datetime)
user_modified (this is nvarchar)
host_modified (this is nvarchar)
date_modified (this is datetime)
Right now our policy is that (a) the _added columns use defaults to populate the data on INSERTs, and (b) triggers are generated to update the _modified fields upon an UPDATE of the table.
Our practice has been (a) to manually create these fields in our scripts as we create new tables in our system and (b) create triggers to perform the update anytime we create a new table.
This practice has been fine until recently, when we began outsourcing some of our development and not all of our standards were adhered to, including this one. I'd like to look at alternatives for somehow maintaining these conventions outside of our development workflows.
The first thing I'd like to ask about is options to eliminate having developers include these columns in the CREATE TABLE statements. Is it possible in SQL Server 2005 to capture when a CREATE TABLE statement is executed and override/append to the initial CREATE TABLE statement?
The second thing I'd like to ask about is options to eliminate having developers write the initial trigger that maintains the _modified fields. I guess if there are options to capture when a CREATE TABLE statement is executed, we could possibly generate a CREATE TRIGGER statement against that object as well?
Another idea I would like thoughts on is using some sort of 'table inheritance' to store this information for all objects in our database. This idea came up when I saw this article - http://www.sqlteam.com/article/implementing-table-inheritance-in-sql-server. Do you think the situation I explained here would fall into this concept?
I'm also open to any other thoughts and/or suggestions.
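On the CREATE TABLE question: SQL Server 2005 does have database-scoped DDL triggers that fire on CREATE_TABLE, which could append the audit columns to any newly created table (a second EXEC of dynamic SQL in the same trigger could also issue the CREATE TRIGGER that maintains the _modified fields). A minimal sketch; the trigger name and column sizes are assumptions:
CREATE TRIGGER ddl_AddAuditColumns ON DATABASE
FOR CREATE_TABLE
AS
BEGIN
    DECLARE @data xml, @schema sysname, @table sysname, @sql nvarchar(max)
    SET @data = EVENTDATA()
    -- Name of the table that was just created
    SELECT @schema = @data.value('(/EVENT_INSTANCE/SchemaName)[1]', 'nvarchar(128)'),
           @table  = @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(128)')
    -- Append the standard audit columns, with defaults covering the _added values
    SET @sql = N'ALTER TABLE ' + QUOTENAME(@schema) + N'.' + QUOTENAME(@table) + N' ADD
        user_added nvarchar(128) NOT NULL DEFAULT SUSER_SNAME(),
        host_added nvarchar(128) NOT NULL DEFAULT HOST_NAME(),
        date_added datetime NOT NULL DEFAULT GETDATE(),
        user_modified nvarchar(128) NULL,
        host_modified nvarchar(128) NULL,
        date_modified datetime NULL'
    EXEC (@sql)
END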
View 2 Replies
Feb 3, 2004
Hi guys... I need to do some statistical reporting on the answers to registration questions. The answers are stored in pipe-delimited varchar fields to allow for one-or-many answers to a single question... here's an example:
Q: What are your fav colors?
A: [checkbox list of colors]
Now the data is stored like this:
Reg001
-----------
red|blue|green|yellow
Now I need to determine how many people selected each individual answer (how many picked red, how many picked yellow, and so on), to report like this:
Red - 12
Blue - 23
Green - 15
Yellow - 9
What is the best, or suggested way, to go about splitting this data up and reporting on it? Can it be done in T-SQL, or will it need to be a combination of T-SQL and .Net language?
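It can be done in T-SQL alone by using a numbers (tally) table to split each pipe-delimited value into rows and then aggregating. A rough sketch, assuming the answers live in a Reg001 column of a Registrations table (adjust names to the real schema):
-- One-time setup: a numbers table covering the longest possible answer string
CREATE TABLE dbo.Numbers (n int NOT NULL PRIMARY KEY)
DECLARE @i int
SET @i = 1
WHILE @i <= 8000
BEGIN
    INSERT INTO dbo.Numbers (n) VALUES (@i)
    SET @i = @i + 1
END
-- Split on '|' and count how many people picked each value
SELECT SUBSTRING(r.Reg001 + '|', num.n, CHARINDEX('|', r.Reg001 + '|', num.n) - num.n) AS answer,
       COUNT(*) AS picked
FROM dbo.Registrations AS r
JOIN dbo.Numbers AS num
  ON num.n <= LEN(r.Reg001 + '|')
 AND SUBSTRING('|' + r.Reg001, num.n, 1) = '|'
GROUP BY SUBSTRING(r.Reg001 + '|', num.n, CHARINDEX('|', r.Reg001 + '|', num.n) - num.n)
ORDER BY picked DESC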
Thanks!
-e
View 1 Replies
View Related
Nov 1, 2014
I have created a server audit and, through a database audit specification, set it to audit SELECT for a certain user on a certain table. I logged in as this user and ran the SELECT statement. When I run this query
"select * from sys.fn_get_audit_file('d:\Audit\audit1*', null, null)"
it returns a row showing the time the query ran.
After 15 minutes I repeated the same action, ran the audit query again, and the same single result showed in the panel. Is it supposed to return a list of entries showing each time this user ran the SELECT statement on that table, for example at 5:00 pm, then 6:00 pm, and so on?
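If the audit is capturing each statement, a query like the following should list one row per SELECT with its own event_time. A quick sketch against the same file path (column names as in the standard audit log; 'SL' is the SELECT action id):
SELECT event_time,
       server_principal_name,
       object_name,
       statement
FROM sys.fn_get_audit_file('d:\Audit\audit1*', NULL, NULL)
WHERE action_id = 'SL'
ORDER BY event_time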
View 1 Replies
View Related
Jan 18, 2006
I need help... here is the problem. Last weekend, the servers in our datacenter were moved around. After this move, and maybe coincidentally, one server is performing very poorly. After running a trace with SQL Profiler, I saw the problem, which was later confirmed with another tool for SQL Server performance monitoring. It seems that all connections to the SQL server (between 200 - 400) are doing a login / logout for each command that they process. For example, the user's connection will log in, perform a SELECT, and then log out. This is not a .NET application. The client software was not changed; it is still the same. The vendor has said that it is not supposed to do that; it is supposed to use one connection that logs on in the morning and logs off at the end of the day or whenever the user exits. One user may have several connections to the database. At times, the server is processing over 250 login / logouts (averaged over a 30 second period). Has anyone seen this problem? I have the server set to AUDIT FAILUREs only. The server has become very unresponsive; things that took 3 seconds now take over 15 seconds. Any ideas???
View 6 Replies
View Related
Jun 11, 2008
Hi,
I am working on an application using C#, Visual Studio 2005, and SQL Server 2005.
I have a few tables in SQL Server 2005.
I need to save the history, i.e. all the inserts, updates, and deletes performed on the tables.
Can anyone suggest how I can achieve that?
Should I use triggers and save the changes in another table?
Waiting for your suggestions.
Thank you
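Triggers that write to a history table are the usual way to do this in SQL Server 2005. A minimal sketch for one table (the Customer table and its columns are only examples; a similar trigger is needed per audited table):
-- History table mirrors the base table plus some audit metadata
CREATE TABLE dbo.Customer_History (
    audit_id     int IDENTITY(1,1) PRIMARY KEY,
    audit_action char(1)       NOT NULL,              -- 'I', 'U' or 'D'
    audit_date   datetime      NOT NULL DEFAULT GETDATE(),
    audit_user   nvarchar(128) NOT NULL DEFAULT SUSER_SNAME(),
    CustomerID   int           NOT NULL,
    CustomerName nvarchar(100) NULL
)
GO
CREATE TRIGGER trg_Customer_History ON dbo.Customer
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON
    -- Rows that were inserted or updated
    INSERT INTO dbo.Customer_History (audit_action, CustomerID, CustomerName)
    SELECT CASE WHEN EXISTS (SELECT 1 FROM deleted) THEN 'U' ELSE 'I' END,
           i.CustomerID, i.CustomerName
    FROM inserted AS i
    -- Rows that were deleted outright
    INSERT INTO dbo.Customer_History (audit_action, CustomerID, CustomerName)
    SELECT 'D', d.CustomerID, d.CustomerName
    FROM deleted AS d
    WHERE NOT EXISTS (SELECT 1 FROM inserted)
END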
View 13 Replies
View Related
Oct 22, 2000
Please, what is the best way to perform index maintenance? I use SQL Server 7.0.
We have been having slow server performance, and one of the options is to do index maintenance. I have researched but could not get a clear picture of what I should do. Has anybody performed the same task before? Thanks for your help!!!
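On 7.0 the standard routine is to measure fragmentation with DBCC SHOWCONTIG, rebuild the worst indexes with DBCC DBREINDEX during a quiet window, and refresh statistics afterwards. A sketch (the Orders table is just an example, and the syntax shown is the 7.0/2000-era form):
-- Check fragmentation (7.0 wants the object id rather than the name)
DECLARE @id int
SET @id = OBJECT_ID('dbo.Orders')
DBCC SHOWCONTIG (@id)
-- Rebuild every index on the table with a 90% fill factor
DBCC DBREINDEX ('dbo.Orders', '', 90)
-- Refresh the table's statistics
UPDATE STATISTICS dbo.Orders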
View 1 Replies
View Related
Apr 5, 2004
Hello Friends,
I am new to the SQL Server arena. I have implemented a procedure which does a series of insert and update statements, and all of these statements must be applied as a unit: all at once or none at all. But if an error occurs in some statement, the rest of the statements are still executed. Please suggest a way or code snippet to achieve atomicity in a SQL Server procedure.
regards,
Ch.Praveen Kumar.
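The usual pattern is an explicit transaction that is rolled back as soon as any statement fails. On SQL Server 2000 that means checking @@ERROR after each statement (on 2005 and later, BEGIN TRY / BEGIN CATCH does the same job more cleanly). A bare-bones sketch with placeholder statements:
CREATE PROCEDURE dbo.usp_DoWork
AS
BEGIN
    SET NOCOUNT ON
    BEGIN TRANSACTION

    INSERT INTO dbo.TableA (Col1) VALUES ('x')
    IF @@ERROR <> 0 GOTO ErrorHandler

    UPDATE dbo.TableB SET Col1 = 'y' WHERE Id = 1
    IF @@ERROR <> 0 GOTO ErrorHandler

    COMMIT TRANSACTION
    RETURN 0

ErrorHandler:
    ROLLBACK TRANSACTION
    RETURN 1
END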
View 4 Replies
View Related
Jan 29, 2008
Scenario:
For the most part we let SQL Server (2005) maintain our statistics for us. However, we do have several large processes written in stored procedures. There is one main controller procedure that can call any number of other procedures. This process can take anywhere from 5 minutes to an hour+ to run (based on the size of the client). Back in the day of SQL Server 2000 we found that the performance of this procedure would diminish over time (while it was running). We implemented a queued concept of issuing UPDATE STATISTICS commands. This was done by adding a SQL Server job that ran every 10 minutes looking for new records in a table. Records were inserted at key points in these stored procedures (after large deletes, updates, inserts).
Goal:
Now, with all that background and with 2005, I'd like to review this concept and remove this implementation if possible, or at least decouple the maintenance of statistics from the business jobs. In 2005, are there better ways to monitor and maintain statistics in a more administrative (but automated) way?
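Two 2005 features worth testing before keeping a hand-rolled queue: asynchronous auto-update of statistics, and a scheduled sp_updatestats step, which only refreshes statistics on tables that have actually changed. A sketch (database and table names are placeholders):
-- Queries keep using the old statistics while new ones are built in the background
ALTER DATABASE MyClientDb SET AUTO_UPDATE_STATISTICS_ASYNC ON
GO
USE MyClientDb
GO
-- Scheduled job step: refresh statistics that have pending modifications
EXEC sp_updatestats
GO
-- Or target a single table right after a large load, inside the procedure itself
UPDATE STATISTICS dbo.BigWorkTable WITH FULLSCAN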
View 15 Replies
View Related
Feb 27, 2007
Our database(s) are all over the place - no documentation - lots of rubbish and unused stuff. I'm managing a project focusing on data quality that covers code changes, alterations to DTS packages, schema changes, etc. What I'd like to do is see where the bit I want to change is being used. That might mean which stored procs use a field, and which sprocs use that sproc. Maybe it's which DTS packages use a sproc (and again up the hierarchy). The list is a long one, but basically I need to know what the effects of changes are. Is there a tool out there that lets me navigate a database to that level of detail? I understand something along the same lines is available for MS Access, but I can't find it for SQL Server. Thanks
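Third-party schema tools exist for this, but a fair amount can be done with the 2005 catalog views: sys.sql_modules holds the full text of procedures, views, and functions, and sp_depends lists recorded references. A rough sketch (the column and procedure names are only examples):
-- Which procedures, views or functions mention a given column or object name?
SELECT OBJECT_NAME(m.object_id) AS referencing_object
FROM sys.sql_modules AS m
WHERE m.definition LIKE '%CustomerStatus%'
-- Dependencies recorded for one object (can miss deferred / dynamic references)
EXEC sp_depends 'dbo.usp_GetCustomer'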
View 1 Replies
View Related
May 24, 2006
I am a beginner in SQL Server. I have developed a simple accounting application in VB and SQL. Now I have successfully completed my application and want to deploy it to my client, so I installed SQL Server and the required VB components on the client's computer. I also created the 'sa' login with a secret password known only by me. I thought my data on that client's computer was safe, but later I found that we can also connect to the SQL Server using the NT administrative account and easily change the data in the database. So now I am worried that if someone accesses the client's computer with the administrator's password, he or she can change my data, resulting in corruption of the data. So is there any way that I can prevent access to the database through the NT administrative account, or any way to track how the data was changed?
View 5 Replies
View Related
Apr 25, 2008
Hi all,
I am working on a portal site where I have created 18 tables in one database; I don't know whether I am right or wrong. Should I continue with the same design, or create two tables, one as a master and another containing the common fields?
But if I create one table for all, then what will happen?
Please tell me what to do and why.
ASAP please.
Thanks for spending your valuable time on me.
View 4 Replies
View Related
Jul 20, 2005
Hello, I am fairly new at stored procedures. I have created some that will go through a table and return a start date and an end date dependent upon the fiscal period you want, but I then need to use those dates in another stored procedure to retrieve the information I need. My stored procedure looks like this.
======================================================
CREATE PROCEDURE dbo.R920ExtTotal
@MthsBack Decimal OUTPUT
AS
DECLARE @sSQL AS NVARCHAR(255), @StartDate as SMALLDATETIME, @EndDate as SMALLDATETIME
Exec @StartDate = GetMthStart @MthsBack
Exec @EndDate = GetMthEnd @MthsBack
SET @sSQL = 'Select count(extension) as Total From r920f00 Where ([date] BETWEEN "' + CONVERT(nvarchar, @StartDate) + '" and "' + CONVERT(nvarchar, @EndDate) + '")'
Select @sSQL
EXEC (@sSQL)
Return
GO
======================================================
The problem is my variables @StartDate and @EndDate do not retain their values after the EXEC statement and revert to 01/01/1900. How can I get around this problem?
Thanks!!!!
Chip
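The pattern "Exec @StartDate = GetMthStart @MthsBack" captures the procedure's integer RETURN code (0 on success), which converts to the smalldatetime 01/01/1900; the dates are never actually assigned. If GetMthStart / GetMthEnd can be altered, OUTPUT parameters are the usual fix. A sketch in which the date calculation is only a placeholder:
-- Hypothetical rewrite of GetMthStart using an OUTPUT parameter
CREATE PROCEDURE dbo.GetMthStart
    @MthsBack decimal,
    @StartDate smalldatetime OUTPUT
AS
    -- the real fiscal-period logic goes here
    SET @StartDate = DATEADD(month, -CAST(@MthsBack AS int), GETDATE())
GO
-- Caller: note the OUTPUT keyword on the argument
DECLARE @StartDate smalldatetime
EXEC dbo.GetMthStart @MthsBack = 3, @StartDate = @StartDate OUTPUT
SELECT @StartDate
-- @StartDate now keeps its value and can be embedded in the dynamic SQL
-- (or, better, passed as a parameter to sp_executesql)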
View 2 Replies
View Related
Jul 20, 2005
Hello, I have a large set of data that I have set up as a partitioned view. The view is partitioned by a datetime column and the individual tables each represent one month's worth of data. I need to keep at least two years' worth of data at all times, but after two years I can archive the data. A sample of the code used is below. It is simplified for space reasons.
My question is, how do other people maintain the database in this type of scenario? I could create all of the tables necessary for the next year and then go through that at the end of each year (archive tables over two years old, add new tables, and change the view), but I was also thinking that I might be able to write a stored procedure that runs once a month and does all three of those tasks automatically. It seems like a lot of dynamic SQL code though for something like that. Alternatively, I could write VB code to handle it in a DTS package.
So, my question again is, how are others doing it? Any suggestions?
Thanks!
-Tom.
CREATE TABLE [dbo].[Station_Events_200401] (
[event_time] [datetime] NOT NULL ,
[another_column] [char] (8) NOT NULL )
GO
CREATE TABLE [dbo].[Station_Events_200402] (
[event_time] [datetime] NOT NULL ,
[another_column] [char] (8) NOT NULL )
GO
CREATE VIEW Station_Events
AS
SELECT event_time, another_column
FROM Station_Events_200401
UNION ALL
SELECT event_time, another_column
FROM Station_Events_200402
GO
View 3 Replies
View Related
Oct 29, 2007
Hi, what is the preferred way to maintain SQL tables on a remote host? I am a newbie to building ASP.NET websites on a remote host. A stumbling point has been the maintenance of SQL tables on the remote host. I understand about doing complete backups and restores, but I am seeking a quicker way to maintain individual tables. I would like to click and edit, but instead I am going through the following 30+ clicks. Is there an easier way? Thanks.
For example, what I do now to build a new data table for a hosted website:
1) Design table
   1a) Name
   1b) Fields & Types
2) SQL Server Management Studio Express (assuming existing database)
   2a) Select Database & Tables
   2b) Add new table
   2c) Add fields; Key must be INT for ACCESS
   2d) Save as (Name_Table)
3) MS Access (requires ODBC to be set up first through the Windows control panel)
   3a) Tables / New / Link / ODBC / Machine_Data_Source
   3b) Pick table
   3c) Edit data, as needed
4) To transfer data, first select the database in the VWD Solution Explorer, then right-click and select the new "Publish to Provider"
   4a) Database Publishing Wizard
   4b) Choose table to script a backup from
   4c) Build script & Copy
5) Start Ipswitch FTP (this step can be replaced by 6e below)
   5a) Locate folder & SQL script file and choose destination directory
   5b) Transfer file
6) Log in to remote host (1and1)
   6a) MS SQL Administration
   6b) Admin (MyLittleTools Admin)
   6c) Tools
   6d) Query Analyser
   6e) Paste script (from step 4)
   6f) Submit (Run)
   6g) Verify table built
FYI: the script to build and populate the new table "Name_Table", built by step 4c above and pasted into the remote host's Query Analyzer in step 6e above.
/****** Object: Table [dbo].[Name_Table] Script Date: 10/28/2007 18:03:58 ******/
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Name_Table]') AND type in (N'U'))
DROP TABLE [dbo].[Name_Table]
GO
/****** Object: Table [dbo].[Name_Table] Script Date: 10/28/2007 18:03:58 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Name_Table]') AND type in (N'U'))
BEGIN
CREATE TABLE [dbo].[Name_Table](
    [ID] [int] NOT NULL,
    [Name] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [Address] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [City] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [State] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [Zip] [nchar](10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [Acsz] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [Phone] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [Fax] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    CONSTRAINT [PK_Name_Table_1] PRIMARY KEY CLUSTERED ([ID] ASC)
    WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
)
END
GO
INSERT [dbo].[Name_Table] ([ID], [Name], [Address], [City], [State], [Zip], [Acsz], [Phone], [Fax]) VALUES (1, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL)
INSERT [dbo].[Name_Table] ([ID], [Name], [Address], [City], [State], [Zip], [Acsz], [Phone], [Fax]) VALUES (2, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL)
INSERT [dbo].[Name_Table] ([ID], [Name], [Address], [City], [State], [Zip], [Acsz], [Phone], [Fax]) VALUES (3, N'Third name', NULL, NULL, NULL, NULL, NULL, NULL, NULL)
View 1 Replies
View Related
Apr 18, 2003
I am wondering how people maintain SQL Servers which run at several customers' sites where disk space is getting smaller and smaller. We have tables in SQL databases which hold a lot of data consisting of statistics, errors, logs, etc.
They grow and grow, and the existing data is not needed anymore once it gets older than, let's say, one year. How do you overcome the problem of trimming the tables without loading the system too much, given that the major application also runs on the same server?
Thanks for any input
mipo
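A common answer is a scheduled job that purges in small batches during quiet hours, so locking and log growth stay modest even though the main application shares the server. A sketch with example table and column names:
-- Delete rows older than one year, 5000 at a time (SQL 2000 style;
-- on 2005 and later DELETE TOP (5000) does the same job)
SET ROWCOUNT 5000
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.ErrorLog
    WHERE LogDate < DATEADD(year, -1, GETDATE())
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0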
View 1 Replies
View Related
Dec 8, 2005
Hi !
I need to maintain a record of how many times any user (e.g., sa) connects to the SQL Server. That means whenever any person connects to the database, through the application or directly, I need to know which SQL user (e.g., sa) they connected as.
Regards,
Shabber Abbas Rizvi.
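On SQL Server 2005 SP2 and later, one option is a server-level logon trigger that writes every successful connection to a table (test it carefully: if the trigger fails, new logins are blocked). Older builds would use login auditing or a server-side trace instead. A sketch with example names:
CREATE TABLE master.dbo.LoginAudit (
    login_name nvarchar(128) NOT NULL,
    host_name  nvarchar(128) NULL,
    event_time datetime      NOT NULL DEFAULT GETDATE()
)
GO
-- Every login needs permission to run this INSERT, or the trigger should use EXECUTE AS
CREATE TRIGGER trg_LogonAudit ON ALL SERVER
FOR LOGON
AS
BEGIN
    INSERT INTO master.dbo.LoginAudit (login_name, host_name)
    VALUES (ORIGINAL_LOGIN(), HOST_NAME())
END
GO
-- How many times has each login connected?
SELECT login_name, COUNT(*) AS logins
FROM master.dbo.LoginAudit
GROUP BY login_name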
View 1 Replies
View Related
Apr 4, 2007
Hello All!
I have an ASP.NET website with a SQL 2005 DB.
DB size of 1.5GB with ~10 tables in it. The largest table has 200k of records in it (website users table), with 500 new records every day.
I've setup this database 4 months ago and didn't touch it since then.
I really have no knowledge of what SQL Server needs in terms of index maintenance / hard drive maintenance.
Lately, the website searches have started to be really slow, and I have started to get timeout errors and deadlock errors.
I have a few indexes for each table based on the recommendations the MS-SQL Database Tuning Advisor gave me.
Some of the indexes are:
Page fullness : 99%
Total Fragmentation: 24%
Others are:
Page fullness : 65%
Total Fragmentation: 99%
I guess I need to start maintaining the DB, defragmenting indexes or the hard drive?
Can anyone help me and provide a guide/information on what needs to be done to keep SQL running fast and happily?
Or a guide on defragmenting indexes and how often I need to defrag?
Thanks,
Shar
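A reasonable starting routine on 2005: check fragmentation with sys.dm_db_index_physical_stats, REORGANIZE moderately fragmented indexes, REBUILD heavily fragmented ones (commonly above roughly 30%), and schedule it weekly for a database this size. A sketch; the index and table names are only examples:
-- Fragmentation overview for the current database
SELECT OBJECT_NAME(s.object_id) AS table_name,
       i.name AS index_name,
       s.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS s
JOIN sys.indexes AS i ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE s.avg_fragmentation_in_percent > 10
ORDER BY s.avg_fragmentation_in_percent DESC
-- Moderate fragmentation: reorganize; heavy fragmentation: rebuild
ALTER INDEX IX_Users_Email ON dbo.Users REORGANIZE
ALTER INDEX IX_Users_Email ON dbo.Users REBUILD WITH (FILLFACTOR = 90)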
View 15 Replies
View Related
Aug 7, 2007
If you have a "Orders" table that is being sync'd to subscribers that are ocassionaly offline, and the subscribers add rows to their local Orders table. When they go online to sync with the published "Orders" table, how do you handle keeping the "OrderId" field unique?
Example:
Both salespeople sync the following data down:
OrderId Desc
1 Order 1
2 Test Order
Both salespeople go offline and add orders
Salesperson 1 adds:
OrderId Desc
3 Joes Order
Salesperson 2 adds:
OrderId Desc
3 Kathys Order
Now, when they go back online, they both will sync their orders up to the main database and they both have the OrderId of 3.
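Merge replication can handle this with automatic identity range management (each subscriber is handed its own block of OrderId values), or the collision can be avoided altogether by keying the table on a GUID. A minimal sketch of the GUID approach, purely illustrative:
CREATE TABLE dbo.Orders (
    OrderId uniqueidentifier ROWGUIDCOL NOT NULL DEFAULT NEWID(),
    Descr   nvarchar(100) NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY (OrderId)
)
-- Offline inserts on each subscriber now get globally unique ids,
-- so the rows no longer collide when the salespeople sync.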
View 3 Replies
View Related
May 4, 2007
Calling all those that use Maintenance Plans.
There are some peculiar goings-on happening on my servers. Plans across servers have been doing funny things like dumping backups in different folders, some jobs have been hanging, and so on and so forth.
And I think it's occurring after modifying SQL Server 2000 plans with SSMS. It's so intermittent, it's hard to put my finger on exactly when it's happening. It may not even have been reported yet. Fixing the anomalies is achieved by recreating the jobs (a matter of unticking and ticking the boxes in the Maintenance Plans).
In SQL 2000, if you create more than 1 schedule on a job that was itself created using a maintenance plan, SQL Server returns a message along the lines of "Dude, do this and weird things might happen". The message doesn't say what exactly, just that it can no longer guarantee the integrity of the plan. And indeed, weird things do happen; I tested it once.
I'm guessing this is a similar problem, but one that has not yet been pinned down. Has anyone come across this when using SSMS to maintain 2000 plans?
At this stage, I would like to point out that this is not a slagging-off-Maintenance-Plans thread... I'm looking in the direction of anyone whose name starts with the letter Kristen or Tara
Drew
"It's Saturday night; I've got no date, a two litre bottle of Shasta, and my all-Rush mix tape... LET'S ROCK."
View 1 Replies
View Related
May 28, 2007
I am using RGB hex values for some colors in my report. They show up fine when rendering to a browser; however, when exporting to Excel my custom colors don't come over. I've tried creating a report.xlt file with my custom colors in it, but when I export to Excel it uses the standard Excel palette instead of the custom palette in my report.xlt.
Any ideas on how to retain my custom colors when exporting to Excel?
View 1 Replies
View Related
May 24, 2007
Hi,
I am very new to SQL Server and am just starting to look into replication as a possible solution to a problem I have. I have a local database which will be running on either MSDE or SQL Express 2005. I have a database on a webserver running SQL Server 2005 Standard which I wish to keep as an exact copy of the MSDE/Express one. If data is changed in the local db then I want the webserver db to be updated with this data, preferably as the change occurs, but it could be on an hourly schedule if required.
If data in the webserver db is altered, then I do not wish for it to update the local db; I would like it to be assumed that the change is erroneous and the item reset to whatever exists in the local db. In other words, should a table or other object be erroneously dropped from the webserver db, the table would be recreated when it syncs with the local db (rather than propagating the change back to the local db and thus losing local data).
Basically I want the web db to be an exact mirror of the local db as often as possible. Given that the webserver is running SQL Server 2005 Standard and the local machine will be running either MSDE or SQL EXPRESS, is this achievable and what would be the best way of achieving it? Should I use merge replication or database mirroring? I have succeeded in getting 1-directional merge replication working by setting the webserver up as a publisher and distributor and setting up a push merge subscription on the local machine with an ExchangeType of 1. However, this won't correct the data in the web db if it is compromised until it is altered locally.
It crossed my mind that I could write a small application to utilise tablediff to compare each table in the DBs and get it to correct the web db if required on a regular basis. Would this be a silly approach?
I would appreciate expert advice!
Max
View 1 Replies
View Related
Mar 24, 2008
What are the basic processes for maintaining an SQL SERVER 2005 Database?
Is there a top ten list of things you should do periodically to all SQL SERVER 2005 databases?
View 2 Replies
View Related
Nov 19, 2007
In my scenario I have about a dozen flat files (text files) that I have to import into SQL Server 2005.
I am using the Flat File connection manager to carry out the task. The flat files contain data generated from Oracle.
When I import data from these text files into SQL, the main problem lies in converting a number(p,s) data type column from Oracle (in the text file) to the numeric(p,s) data type of SQL Server 2005.
The number(p,s) data type loses all its digits after the decimal point (they come through as zero) during the import process.
For example
1.2434234390 (from the text file, number(p,s) type of Oracle) converts to 12.0000000000 (numeric(p,s)) in SQL Server 2005.
Is there any workaround to this problem? I urgently need help.
View 7 Replies
View Related
Feb 29, 2012
I deleted some records out of an entity. I'd like to keep the Codes contiguous and incremental, meaning no breaks between the code numbers. I created a business rule and applied it, but the codes remain the same.
I used the "Default to a generated Value" action, then selected the Code attribute and saved.
Then, back in the entity, I applied business rules, but nothing seemed to have happened, as there was no change in the codes.
There are no validation errors either.
View 2 Replies
View Related
Jan 28, 2008
I have just started using SQL Server reporting services and am stuck with creating subreports.
I have added a subreport to the main report. When I right-click on the subreport, go to Properties -> Parameters, and click on the dropdown for Parameter Value, I see all the Sum and Count fields but not the data fields.
For example, In the dropdownlist for the Parameter value, I see Sum(Fields!TASK_ID.Value, "AppTest"), Count(Fields!TASK_NAME.Value, "CammpTest") but not Fields!TASK_NAME.Value, Fields!TASK_ID.Value which are the fields retrieved from the dataset assigned to the subreport.
When I manually change the parameter value to Fields!TASK_ID.Value, and try to preview the report, I get Error: Subreport could not be shown. I have no idea what the underlying issue is but am guessing that it's because the field - Fields!TASK_ID.Value is not in the dropdown but am trying to link the main report and sub report with this field.
Am I missing something here? Any help is appreciated.
Thanks,
Sirisha
View 3 Replies
View Related
Feb 20, 2008
How can I create a Table whose one field will be 'tableid INT IDENTITY(1,1)' and other fields will be the fields from the table "ashu".
Can this be done in SQL Server without explicitly writing the "ashu" table's field names?
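If it is a one-off, SELECT ... INTO can create the new table without listing the columns of "ashu", using the IDENTITY() function for the new key (this only works in SELECT INTO, and fails if ashu already has an identity column). A sketch:
-- New table gets tableid as an identity column plus every column from ashu
SELECT tableid = IDENTITY(int, 1, 1),
       a.*
INTO dbo.ashu_with_id
FROM dbo.ashu AS a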
View 8 Replies
View Related
Jan 26, 2008
Sir,
I have got this error message while trying to establish a connection with a recordset in VB .NET. Can you please rectify this?
Too many arguments to 'Public Overridable ReadOnly Default Property Fields() As ADODB.Fields'
My code is like this:
rs = New ADODB.Recordset
rs.Open("Select * from UserLogin where userid='" & txtUserName.Text & "'", gstrDB, DB.CursorTypeEnum.adOpenStatic)
If txtUserName.Text = rs.Fields.Append(userid) Then
MsgBox("OK", MsgBoxStyle.OKOnly, "Confirmation")
End If
thanks
View 1 Replies
View Related
Jul 2, 2007
here is my schema...
Board of Zoning Appeals
Parcel#
BZACase#
ApplicantID
OwnerID
DateFiled
Size
Zoning
VU (Variance of Use)
BZACase#
ProposedUse
Comments
VDS (Variance of Developmental Standard)
BZACase#
OrdinanceReq
RequestedDim
ProposedUse
Comments
SE (Special Exception)
BZACase#
CurrentUse
ProposedUse
OrdinanceReq
RequestedDim
Comments
Applicant
ApplicantID
FirstName
LastName
CompanyName
Line1
Line2
City
State
Zip
PhoneNum
Owner
OwnerID
FirstName
LastName
CompanyName
Line1
Line2
City
State
Zip
PhoneNum
Now I know what I'm doing with the ApplicantID and OwnerID, but the BZACase# is a number/unique identifier that looks like this: 2007-VU-000, 2007-VU-001, 2007-VU-003. So my questions are:
1. How do I get the last three numbers to increment each time a new application is created?
2. How do I retrieve the last record in the table?
3. Do you have any other suggestions? I have to have the number and the type of form they applied for in the "case#" (see the sketch below).
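One way to answer 1 and 2 together is to keep a plain int IDENTITY as the real key and derive the '2007-VU-000' style case number from it, instead of trying to increment the text. A sketch with illustrative names:
CREATE TABLE dbo.BZACase (
    CaseSeq  int IDENTITY(0,1) NOT NULL PRIMARY KEY,
    CaseYear int     NOT NULL DEFAULT YEAR(GETDATE()),
    CaseType char(3) NOT NULL,   -- 'VU', 'VDS' or 'SE'
    -- Formatted case number, e.g. 2007-VU-003, derived from the other columns
    BZACaseNo AS CAST(CaseYear AS varchar(4)) + '-' + RTRIM(CaseType) + '-'
                + RIGHT('000' + CAST(CaseSeq AS varchar(10)), 3)
)
-- Last record entered
SELECT TOP 1 * FROM dbo.BZACase ORDER BY CaseSeq DESC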
View 11 Replies
View Related
Mar 26, 2003
I am requesting suggestions to solve my problem.
Background: We are changing the way we pay commissions to our rep groups. We used to pay when the order was placed; now we want to pay when the invoice is paid.
Problem: The commission information is currently stored in the customer order, not in the invoice. These orders get deleted a couple of weeks after the order is completed (shipped).
I want to create another, rather dynamic, table/structure that will store the order number and the commission percentage.
This info in this table should:
Be deleted: if the order has been deleted and the invoice either does not exist or was paid some period of time ago (maybe 6 months)
Be updated: if the customer order has been updated (i.e. the commission was changed)
Be inserted: if the order exists but the order number is not in the new table.
That is it in a nutshell.
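A sketch of what a nightly maintenance job for such a table might look like; every table and column name here is only illustrative:
-- 1. Insert orders that are not yet in the commission table
INSERT INTO dbo.OrderCommission (OrderNo, CommissionPct)
SELECT o.OrderNo, o.CommissionPct
FROM dbo.CustomerOrder AS o
WHERE NOT EXISTS (SELECT 1 FROM dbo.OrderCommission AS c WHERE c.OrderNo = o.OrderNo)
-- 2. Update rows whose commission changed on the order
UPDATE c
SET CommissionPct = o.CommissionPct
FROM dbo.OrderCommission AS c
JOIN dbo.CustomerOrder AS o ON o.OrderNo = c.OrderNo
WHERE c.CommissionPct <> o.CommissionPct
-- 3. Delete rows whose order is gone and whose invoice is missing or paid over 6 months ago
DELETE c
FROM dbo.OrderCommission AS c
LEFT JOIN dbo.CustomerOrder AS o ON o.OrderNo = c.OrderNo
LEFT JOIN dbo.Invoice AS i ON i.OrderNo = c.OrderNo
WHERE o.OrderNo IS NULL
  AND (i.OrderNo IS NULL OR i.PaidDate < DATEADD(month, -6, GETDATE()))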
Thanks,
Brian
View 1 Replies
View Related
Sep 25, 2006
Hi,
I have written a procedure for a stock report.
It's working fine. Please go through the SP and give me some suggestions; please tell me where I need to improve my code. Thanks.
Note: the user is required to execute this procedure daily.
I am taking the sum of issues, purchases, returns, and physical adjustments for each and every product from the last updated date to today's date and storing it in a table, i.e. Stock_Dump. From this table I generate the date-wise stock report.
SET QUOTED_IDENTIFIER OFF
GO
SET ANSI_NULLS ON
GO
CREATE PROCEDURE dbo.spUpdateStock
@strReturn varchar(70) output
AS
BEGIN
declare @maxDt smalldatetime
if exists(Select * from Stock_Dump where Txn_Date=
Convert(varchar,Getdate(),101))
BEGIN
set @strReturn='Stock Table already generated
for the day. cannot generate it again'
END
ELSE
BEGIN
TRUNCATE TABLE Stock_Dump_Temp
select @maxDt=max(Txn_Date) from Stock_Dump
/* insert (Opening stock) Closing stock for all
all the products from last max Date*/
INSERT INTO Stock_Dump_Temp Select Product_code,
convert(varchar,GetDate(),101) as Txn_Date,
Closing_Stock as Opening_Stock ,
0,0,0,0,0,0,0 from Stock_Dump Where
Txn_Date=Convert(varchar,@maxDt,101)
/* Issues*/
INSERT INTO Stock_Dump_Temp Select Product_code,
convert(varchar,GetDate(),101) as Txn_Date,0,
Sum(Qty) as Issue_Qty,0,0,0,0,0,0 from Issue_Details
Where Issue_No IN(Select Issue_No from Issue_Hdr
Where Issue_Date > Convert(varchar,@maxDt,101) and
Issue_Date <= Convert(varchar,getdate(),101))
Group by Product_Code
/* Goods receipt*/
INSERT INTO Stock_Dump_Temp Select Product_code,
convert(varchar,GetDate(),101) as Txn_Date,0,0,
Sum(Qty) as Purchase,0,0,0,0,0 from Dlv_note_Details
Where Dlv_Note_No IN(Select Dlv_Note_No from
Dlv_Hdr Where Dlv_Note_Date > Convert(varchar,@maxDt,101) and
Dlv_Note_Date <= Convert(varchar,getdate(),101))
Group by Product_Code
/* Rejection after receipt*/
INSERT INTO Stock_Dump_Temp Select Product_code,
convert(varchar,GetDate(),101) as Txn_Date,0,0,
0,Sum(Qty) as Rejected,0,0,0,0 from
Rejection_Details Where Rejection_No IN
(Select Rejection_No from Rejection_Hdr Where
Rejection_Date > Convert(varchar,@maxDt,101) and
Rejection_Date <= Convert(varchar,getdate(),101))
Group by Product_Code
/* Issues returns*/
INSERT INTO Stock_Dump_Temp Select Product_code,
convert(varchar,GetDate(),101) as Txn_Date,0,0,
0,0,Sum(Qty) As Issue_Returns,0,0,0 from
Issue_Return_Details Where Issue_R_No
IN(Select Issue_R_No from Issue_Return_Hdr
Where Return_Date > Convert(varchar,@maxDt,101) and
Return_Date <= Convert(varchar,getdate(),101))
Group by Product_Code
/* Physical Stock + */
INSERT INTO Stock_Dump_Temp Select Product_code,
convert(varchar,GetDate(),101) as Txn_Date,0,0,
0,0,0,Sum(Var_Qty) as Phy_Qty_P,0,0 from
Physical_Details Where Var_Qty>0 and Txn_No
IN(Select txn_No from Physical_Hdr Where
Txn_Date > Convert(varchar,@maxDt,101) and
Txn_Date <= Convert(varchar,getdate(),101))
Group by Product_Code
/* Physical -*/
INSERT INTO Stock_Dump_Temp Select Product_code,
convert(varchar,GetDate(),101) as Txn_Date,0,0,
0,0,0,0,Sum(Var_Qty) as Phy_Qty_M,0 from
Physical_Details Where Var_Qty<0 and Txn_No
IN(Select txn_No from Physical_Hdr Where
Txn_Date > Convert(varchar,@maxDt,101) and
Txn_Date <= Convert(varchar,getdate(),101))
Group by Product_Code
/* insert all the records into actual table i,e Stock_dump from Stock_dump_temp (temporory table)*/
INSERT INTO Stock_Dump Select Product_code,Txn_Date,
Sum(Opening_Stock) as Opening_Stock,Sum(Issue_Qty) as
Issue_Qty,Sum(purchase) as Purchase,Sum(Rejected) as
Rejected,Sum(Issue_Returns) as Issue_returns,
Sum(Phy_Qty_P) as Phy_Qty_P,Sum(Phy_Qty_M) as
Phy_Qty_M,0 as Closing_Stock from Stock_Dump_Temp
Group By ProducT_Code,Txn_Date
/* update closing stock*/
UPDATE Stock_Dump Set
Closing_Stock=abs((Opening_Stock+Purchase+Issue_Returns+Phy_Qty_P)-(Issue_Qty+Rejected+Phy_Qty_M))
Where Txn_Date=Convert(varchar,Getdate(),101)
/* delete unwanted records */
DELETE From Stock_Dump Where Opening_Stock=0 and
Issue_Qty=0 and Purchase=0 and Rejected=0
and Issue_Returns=0 and Phy_Qty_M=0 and Phy_Qty_P=0
set @strReturn='Stock Table Update Successfully'
return
END
END
GO
SET QUOTED_IDENTIFIER OFF
GO
SET ANSI_NULLS ON
GO
suji
View 1 Replies
View Related
Nov 21, 2006
I have a database which contains more than 20000 stored procedures which were created with ANSI_NULLS off. I found this out using the query:
SELECT name, AnsiNullsOn FROM
(SELECT name, OBJECTPROPERTY(id, 'ExecIsAnsiNullsOn') AS AnsiNullsOn
FROM sysobjects WHERE type = 'P') A
WHERE AnsiNullsOn = 0
Is there any way that I can set this property to 1 for all the stored procedures I have? I know the alternate method is to drop the procedures and execute the scripts again with ANSI_NULLS ON. Is there any other simple way? It would be very helpful for me.
View 2 Replies
View Related
Dec 8, 2006
I want to load text files into SQL Server in a set-based way, not row by row. What would be the best way to do the transfer?
let me know.
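If set-based here means loading each whole file in one statement instead of row-by-row inserts, BULK INSERT (or bcp, or a bulk-load task in DTS/SSIS) is the usual route. A sketch with placeholder names and delimiters:
BULK INSERT dbo.StagingTable
FROM 'C:\data\input.txt'
WITH (
    FIELDTERMINATOR = '\t',   -- tab-delimited source file
    ROWTERMINATOR = '\n',
    FIRSTROW = 2              -- skip a header row
)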
View 10 Replies
View Related
May 4, 2008
Hello all!
I have this simple sp.
SELECT VisName
FROM tblVis
WHERE (VisID = 1)
Now I have, let's say, VisID 1 to 50. I'm using this SP to change the text on a button. Now I have 50 buttons. So I run this SP, and then I run this in my VB.NET code:
Code Snippet
Dim constr As New SqlConnection(PVDBConn)
Try
'Variable to hold the results
Dim results As String = String.Empty
cmdUpd = New SqlCommand("SelVis1Name", constr)
cmdUpd.CommandType = CommandType.StoredProcedure
constr.Open()
'Set results to the value returned from ExecuteScalar()
results = CType(cmdUpd.ExecuteScalar(), String)
constr.Close()
'Set our buttons text to that value
Button1.Text = results
Catch ex As Exception
MsgBox(ex.Message.ToString)
End Try
At any time, when I start my program, I may need to label 10 buttons, or up to 50. Now I will have this number in a text file. Can I grab that number from a text file, and pass it into a SP?
And can I write this SP only once so it works for more than one label at a time, or do I have to write this SP 50 times?
TIA!
Rudy
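One parameterised procedure should cover all 50 buttons; the number read from the text file is simply passed in as the parameter (on the VB side, something like cmdUpd.Parameters.AddWithValue("@VisID", n) before ExecuteScalar). A sketch of the T-SQL side:
CREATE PROCEDURE dbo.SelVisName
    @VisID int
AS
    SET NOCOUNT ON
    SELECT VisName
    FROM tblVis
    WHERE VisID = @VisID
GO
-- Example call: returns the caption for button 7
EXEC dbo.SelVisName @VisID = 7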
View 5 Replies
View Related