The reason it is poorly designed is that the table is used to hold questions and answers, all with a 1:1 relationship. Instead of having ID, ProductType, Question, Answer, they have unfortunately adopted the approach above, i.e.:
id: 1,  thisID: 3,   parentID: NULL, DESCRIPTION: this is a question
id: 20, thisID: 3_1, parentID: 3,    DESCRIPTION: this is the answer to the question above
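For contrast, a minimal sketch of the flat design implied above (one row per Q&A pair), using only the columns the poster names:

CREATE TABLE FAQ
(
    ID int IDENTITY(1,1) PRIMARY KEY,
    ProductType varchar(255),
    Question varchar(8000),
    Answer varchar(8000)
)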
So I am writing a sproc that does this using a temp table. I got this far:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author: Spencer
-- =============================================
ALTER PROCEDURE [dbo].[GetFAQs]
-- Add the parameters for the stored procedure here
@ProductType varchar(255)
AS
BEGIN
SET NOCOUNT ON;
-- Insert statements for procedure here
CREATE TABLE #TEMP
(
    ID int,
    thisID varchar(50) NULL,
    parentID varchar(50) NULL,
    Title varchar(255) NULL,
    DescriptionQ varchar(8000) NULL,
    DescriptionA varchar(8000) NULL,
    ProductType varchar(255) NULL
)
SELECT
ID,
thisID,
parentID,
Title,
DescriptionQ,
DescriptionA,
ProductType
FROM
A2Z
WHERE
ProductType = @ProductType AND parentID IS NULL
END
GO
This gets all my questions for that product type.
What I need to do is load the questions into my temp table and then run through the A2Z table again, gathering the answers to the questions (the parentID holds the question ID). The answers will then also get loaded into the temp table.
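A hedged sketch of that second pass, assuming the answer text travels in the child row's DescriptionQ column (the column names come from the SELECT above): load the questions into the temp table, then pull each answer across by matching the child row's parentID to the question's thisID.

INSERT INTO #TEMP (ID, thisID, parentID, Title, DescriptionQ, ProductType)
SELECT ID, thisID, parentID, Title, DescriptionQ, ProductType
FROM A2Z
WHERE ProductType = @ProductType AND parentID IS NULL

-- Second pass: copy each answer onto its question's row.
UPDATE t
SET t.DescriptionA = a.DescriptionQ
FROM #TEMP t
INNER JOIN A2Z a ON a.parentID = t.thisID

SELECT * FROM #TEMP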
I have a problem loading data from multiple Excel sheets into the corresponding tables in a DB. All the Excel sheets are in a folder, and from that folder I have to access all the Excel sheets. For this I am using a Script Task and one Data Flow Task, but I am getting an error in the Script Task - I am not able to put the path in the script.
Is this the correct way to do it? Or is there another way?
Can you please tell me the solution for this? Thanks in advance to everyone who responds.
I have the following query and I would like to combine it into one query instead of using several temp tables.
IF OBJECT_ID('Tempdb..#a') IS NOT NULL DROP TABLE #a
SELECT * INTO #a FROM #Web_table w WITH(NOLOCK) WHERE w.generated_date IS NOT NULL
IF OBJECT_ID('Tempdb..#b') IS NOT NULL DROP TABLE #b
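One hedged way to collapse this, if you are on SQL Server 2005 or later and the temp tables only feed the final SELECT: replace each intermediate temp table with a CTE so everything happens in one statement.

;WITH a AS
(
    SELECT * FROM #Web_table w WITH (NOLOCK) WHERE w.generated_date IS NOT NULL
)
SELECT a.*
FROM a
-- join further CTEs here in place of #b, #c, ...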
I am trying to load data from multiple FoxPro tables which are under a folder. I can have multiple folders, each with 17 FoxPro tables. I was able to do it in DTS using an ActiveX script. Here is the ActiveX script:
'**********************************************************************
' Visual Basic ActiveX Script
'************************************************************************
Option Explicit

Dim conObj, DSNGosfbill, comObj, objRs, HostServer
Dim sFolder, sFileFolder, Details, subFolderoccur, sFileFolderDBF, sFileFolderFPT, CheckFile, dFiles, Fil
Dim fso, folderObj, subFolderList, dFolderObj
Dim objPackage, oStep, objPackage_1, oStep_1, ConnObj_001, ConnObj_004, ConnObj_031, ConnObj_032, ConnObj_033
Dim ConnObj_Hclaimb, ConnObj_HProv, ConnObj_Hids, ConnObj_HCodes, ConnObj_HSpan, ConnObj_002, ConnObj_HCHGB

Set conObj = CreateObject("ADODB.Connection")
HostServer = DTSGlobalVariables("gvServer").Value

Set comObj = CreateObject("ADODB.Command")
Set comObj.ActiveConnection = conObj

Function Main()
    Dim Dir_Name, DirFlag
    Dir_Name = ""
    DirFlag = "N"
    Set fso = CreateObject("Scripting.FileSystemObject")

    If fso.FileExists(CheckFile) Then
    Else
        Details = "***** Success.Lst file is missing in Batch folder. BATCH job may not be successfull or there are no folders in UNZIP directory to process. Check the batch run.*****"
        Call Write_Log
        Main = DTSTaskExecResult_Failure
        Exit Function
    End If

    Set folderObj = fso.GetFolder(sFolder)
    Set subFolderList = folderObj.SubFolders
    For Each subFolderOccur In subFolderList
        DirFlag = "Y"
        Dir_Name = subFolderOccur.Name
        Call Process_Dir(1, subFolderOccur.Name)
    Next

    If DirFlag = "N" Then
        Details = "***** No directories to process in SSI UNZIP folder*****"
        Call Write_Log
    End If

    If DirFlag = "Y" Then
        Call Process_Dir(2, Dir_Name)
        If objRs.Eof Then
            Details = "***** No directories to process in SSI UNZIP folder*****"
            Call Write_Log
        End If

        While Not objRs.EOF
            Set sFileFolder = fso.GetFolder(sFolder & objRs("zip_file_name"))
            Details = "***** Start-Time " & sFileFolder & " " & Date & " " & Time & "*****"
            Call Write_Log
            Call Update_Process_Flag("L", objRs("zip_file_name"))

            '******* Execute the package for each directory *******'
            '******* Call the Package *******'
            Set objPackage = CreateObject("DTS.Package")
            Set objPackage_1 = CreateObject("DTS.Package")

            Set ConnObj_001 = objPackage.Connections("SSIPATH001")
            ConnObj_001.DataSource = sFileFolder

            Set ConnObj_002 = objPackage.Connections("SSIPATH002")
            ConnObj_002.DataSource = sFileFolder

            Set ConnObj_004 = objPackage.Connections("SSIPATH004")
            ConnObj_004.DataSource = sFileFolder

            Set ConnObj_031 = objPackage.Connections("SSIPATH031")
            ConnObj_031.DataSource = sFileFolder

            Set ConnObj_032 = objPackage.Connections("SSIPATH032")
            ConnObj_032.DataSource = sFileFolder

            Set ConnObj_033 = objPackage.Connections("SSIPATH033")
            ConnObj_033.DataSource = sFileFolder

            Set ConnObj_Hclaimb = objPackage.Connections("SSIPATHCLAIMB")
            ConnObj_Hclaimb.DataSource = sFileFolder

            Set ConnObj_HProv = objPackage.Connections("SSIPATHPROV")
            ConnObj_HProv.DataSource = sFileFolder

            Set ConnObj_Hids = objPackage.Connections("SSIPATHHIDS")
            ConnObj_Hids.DataSource = sFileFolder

            Set ConnObj_HCodes = objPackage.Connections("SSIPATHCODES")
            ConnObj_HCodes.DataSource = sFileFolder

            Set ConnObj_HSpan = objPackage.Connections("SSIPATHSPAN")
            ConnObj_HSpan.DataSource = sFileFolder

            Set ConnObj_HCHGB = objPackage.Connections("SSIPATHCHGB")
            ConnObj_HCHGB.DataSource = sFileFolder

            objPackage.Execute
            For Each oStep In objPackage.Steps
                If oStep.ExecutionResult = DTSStepExecResult_Failure Then
                    Details = "***** GOSFBILL_SSI_Staging_Load failed. " & Date & " " & Time & "*****"
                    Call Write_Log
                    Main = DTSTaskExecResult_Failure
                    Exit Function
                End If
            Next

            For Each oStep_1 In objPackage_1.Steps
                If oStep_1.ExecutionResult = DTSStepExecResult_Failure Then
                    Details = "***** GOSFBILL_SSI_Update_FileSource failed. " & Date & " " & Time & "*****"
                    Call Write_Log
                    Main = DTSTaskExecResult_Failure
                    Exit Function
                End If
            Next

            '********************************************'
            Details = "***** End-Time " & sFileFolder & " " & Date & " " & Time & "*****"
            Call Write_Log
            objPackage.Uninitialize
            objPackage_1.Uninitialize
            Set objPackage = Nothing
            Set objPackage_1 = Nothing
            sFileFolder = ""
            sFileFolderDBF = ""
            sFileFolderFPT = ""
            objRs.MoveNext
        Wend
        objRs.Close
    End If

    Call Close_Conn
    Main = DTSTaskExecResult_Success
End Function

Sub Process_Dir(Para_cntl, Dir_Name)
    comObj.CommandText = "dbo.Usp_Process_Dir"
    comObj.CommandType = 4
    comObj.Parameters.Refresh
    comObj.Parameters("@Para_Cntl") = Para_cntl
    comObj.Parameters("@Dir_Nm") = Dir_Name
    comObj.Parameters("@File_Type") = "SSI"
    If (Para_cntl = 1) Then
        comObj.Execute()
    Else
        If Para_cntl = 2 Then
            Set objRs = comObj.Execute()
        End If
    End If
End Sub

Sub Update_Process_Flag(P_Flag, Dir_Name)
    comObj.CommandText = "dbo.Usp_Process_Flag"
    comObj.CommandType = 4
    comObj.Parameters.Refresh
    comObj.Parameters("@Process_Flag") = P_Flag
    comObj.Parameters("@Dir_Nm") = Dir_Name
    comObj.Execute()
End Sub

Sub Write_Log
    comObj.CommandText = "dbo.usp_etl_write_log"
    comObj.CommandType = 4
    comObj.Parameters.Refresh
    comObj.Parameters("@Text") = Details
    comObj.Parameters("@NDC_SSI_IND") = "SSI"
    comObj.Parameters("@Process_Stage") = "Staging"
    comObj.Execute()
End Sub

Sub Close_Conn
    Set comObj = Nothing
    Set objRs = Nothing
    conObj.Close
    Set conObj = Nothing
    Set fso = Nothing
    Set folderObj = Nothing
    Set subFolderList = Nothing
End Sub
When I migrated this code to SSIS, it did not work. How can I achieve this functionality in SSIS? Anyone, please help me.
What's the best way to go about inserting data from several tables that all contain the same type of data I want to store (employeeID, employerID, date.. etc) into a temp table based on a select query that filters each table's data?
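A minimal sketch of one approach, with hypothetical table names EmployeeContract and EmployeeTemp standing in for the real sources: a single INSERT with UNION ALL lets each branch keep its own filter while landing everything in one temp table.

CREATE TABLE #Staging (employeeID int, employerID int, startDate datetime)

INSERT INTO #Staging (employeeID, employerID, startDate)
SELECT employeeID, employerID, startDate
FROM EmployeeContract
WHERE startDate >= '20080101'        -- this table's own filter
UNION ALL
SELECT employeeID, employerID, startDate
FROM EmployeeTemp
WHERE employerID = 42                -- a different filter for this table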
I have multiple XML data files in a directory, say C:\XMLData: abc1.xml, abc2.xml, abc3.xml, etc.
I need to loop through each file in SSIS with a Foreach Loop container, get the file name (say abc1), and load the data of abc1.xml into the abc1 table in the SQL Server DB.
The next iteration will pick up abc2.xml, find the abc2 table in the SQL Server DB, and insert the data into the abc2 table.
On each iteration, the XML source should also point to the corresponding XSD file.
Tables are already created in the DB.
I solved my problem up to the point of getting the file name from each iteration and assigning it to a variable; in the OLE DB destination's data access mode I selected 'Table name or view name variable', so the corresponding table gets selected for data insertion.
I just wanted to know how I can read each XSD file for each XML data file while iterating.
I am writing a stored procedure to pull records from a table of personal data related to jobs applied for.
There are 3 tables involved, the jobs, the applicants and the applications.
I need to search on job title, job ref from the jobs table and on forename, surname and applicant ID from the applicants table.
There are some quirks here: if the user enters an applicant ID, then we can simply scan the jobs table for that ID where it also matches the job title wildcards, so that is quite easy to manage.
My own idea for the more complicated searches was to gather the unique IDs from the jobs table into a variable, then do similarly with the applicants, and then search the applications table where both these IDs matched. I think that wouldn't work so well if using the WHERE IN() clause for the main query?
So what approach would be best here to perform the second part of the SP if the client hasn't passed an ApplicantID?
Obviously the applications table has both the JobID and ClientID to relate back to the jobs and applicants tables.
So can anyone help here? It seems a fairly complicated statement, or set of statements, would be required.
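As a hedged sketch (column names are guesses from the description), joining the three tables directly avoids collecting IDs into variables at all; the optional parameters are handled with the usual "NULL or match" pattern:

SELECT ap.ApplicationID, j.JobTitle, j.JobRef, a.Forename, a.Surname
FROM Applications ap
INNER JOIN Jobs j       ON j.JobID = ap.JobID
INNER JOIN Applicants a ON a.ApplicantID = ap.ClientID
WHERE (@ApplicantID IS NULL OR a.ApplicantID = @ApplicantID)
  AND (@JobTitle   IS NULL OR j.JobTitle LIKE '%' + @JobTitle + '%')
  AND (@Forename   IS NULL OR a.Forename LIKE @Forename + '%')
  AND (@Surname    IS NULL OR a.Surname  LIKE @Surname + '%')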
Looking at BOL for temp tables help, I discover that a local temp table (I want to only have life within my stored proc) SHOULD be visible to all (child) stored procs called by the papa stored proc.
However, the following code works just peachy when I use a GLOBAL temp table (i.e., ##MyTempTbl) but fails when I use a local temp table (i.e., #MyTempTable). Through trial and error, and careful weeding efforts, I know that the error I get on the local version is coming from the xp_sendmail call. The error I get is: ODBC error 208 (42S02) Invalid object name '#MyTempTbl'.
Here is the code that works:

SET NOCOUNT ON
CREATE TABLE ##MyTempTbl (SeqNo int identity, MyWords varchar(1000))
INSERT ##MyTempTbl values ('Put your long message here.')
INSERT ##MyTempTbl values ('Put your second long message here.')
INSERT ##MyTempTbl values ('put your really, really LONG message (yeah, every guy says his message is the longest...whatever!')
DECLARE @cmd varchar(256)
DECLARE @LargestEventSize int
DECLARE @Width int, @Msg varchar(128)
SELECT @LargestEventSize = Max(Len(MyWords)) FROM ##MyTempTbl

SET @cmd = 'SELECT Cast(MyWords AS varchar(' + CONVERT(varchar(5), @LargestEventSize) + ')) FROM ##MyTempTbl order by SeqNo'
SET @Width = @LargestEventSize + 1
SET @Msg = 'Here is the junk you asked about' + CHAR(13) + '----------------------------'
EXECUTE Master.dbo.xp_sendmail 'YoMama@WhoKnows.com',
    @query = @cmd,
    @no_header = 'TRUE',
    @width = @Width,
    @dbuse = 'MyDB',
    @subject = 'none of your darn business',
    @message = @Msg
DROP TABLE ##MyTempTbl
The only thing I change to make it fail is the table name, change it from ##MyTempTbl to #MyTempTbl, and it dashes the email hopes of the stored procedure upon the jagged rocks of electronic despair.
Any insight anyone? Or is BOL just full of...well..."stuff"?
I have an application that I am working on that uses some small temp tables. I am considering moving them to table variables - would this be a performance enhancement?

Some background information: the system I am working on has numerous tables, but for this exercise there are only three that really matter: Claim, Transaction and Parties.

A Claim can have 0 or more transactions.
A Claim can have 1 or more parties.
A Transaction can have 1 or more parties.
A party can have 1 or more claims.
A party can have 1 or more transactions.

Parties are really many-to-many back to the Claim and Transaction tables.

I have three stored procs: insertClaim, insertTransaction, insertParties.

From an XML point of view the data looks like this: <claim><parties><info />

insertClaim takes 3 sets of parameters - all the claim-level information (as individual parameters), all the parties on a claim (as one XML parameter), and all the transactions on a claim (as one XML parameter with parties as part of the XML).

insertClaim calls insertParties and passes in the parties XML - insertParties returns a recordset of the newly inserted records. insertClaim then uses that table to join the claim to the parties. It then calls insertTransaction and passes the transaction XML into that sproc. insertTransaction then inserts the transactions in the XML, and also calls insertParties, passing in the XML snippet
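For what it's worth, a side-by-side sketch of the two options being weighed (columns are hypothetical): table variables avoid statistics and most recompiles, which tends to help at small row counts, while temp tables get real statistics, which usually wins once rows run into the thousands.

-- Table variable: no statistics, scoped to the batch/procedure.
DECLARE @Parties TABLE (PartyID int PRIMARY KEY, ClaimID int)

-- Temp table: gets statistics and can be indexed after creation.
CREATE TABLE #Parties (PartyID int PRIMARY KEY, ClaimID int)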
I have 3 checkbox list panels that query the DB for the items. Panels no. 2 and 3 need to know the selection on panel no. 1. Panels allow multiple item selection. Multiple users may use this at the same time, and I wanted to have a full separation between the application and the DB. The ASP.net application always uses stored procedures to access the DB. What's the best course of action? Using a permanent 'temp' table on the SQL Server? Accomplishing everything on the client side?
[Web application being built on ASP.net 3.5 (IIS7) connected to SQL Server 2005]
Hi! I have a general SQL CE v3.5 design question related to table/file layout. I have a system that has multiple tables that fall into categories of data access. The 3 categories of data access are:
1 is for configuration-related data. There is one application that will read/write to the data, and a second application that will read the data on startup.
1 is for high-performance temporal storage of data. The data objects are all the same type, but they are our own custom object and not just simple types.
1 is for logging where the data will be permanent - unless the configured size/recycling settings cause a resize or cleanup. There will be one application writing a lot [potentially] of data depending on log settings, and another application searching/reading sections of data.

When working with data and designing the layout, I like to approach things from a data-centric mindset, because this seems to result in a better-performing system. That said, I am thinking about using 3 individual SDF files for the above data access scenarios - as opposed to a single SDF with multiple tables. I'm thinking this would provide better performance in SQL CE because the query engine will not have a lot of different types of queries going against the same database file. For instance, the temporal storage is basically reading/writing/deleting various amounts of data. And this is different from the logging, where the log can grow pretty large - definitely bigger than the default 128 MB. So it seems logical to manage them separately.
I would greatly appreciate any suggestions from the SQL CE experts with regard to my approach. If there are any tips/tricks with respect to different data access scenarios - taking into account performance, type of data access, etc. - I would love to take a look at that.
We have a job that loads data from an Oracle DB into our SQL Server 2000 DB twice a day. The schedule has just changed so that now there is a possibility of having my west coast users impacted when it runs at 5 PM PST and my east coast users impacted when it runs at 7 AM EST. As a workaround, I have developed a DTS package that loads the data into temp tables instead of the real tables, i.e. Oracle -> XTable_temp instead of Oracle -> XTable. The load sometimes takes about an hour to an hour and a half, so this solution works great, but I want to then lock the table, delete it and rename the temp table to XTable. The pseudocode would be:
Begin Transaction
Lock Table XTable
Drop XTable
Alter Table XTable_temp rename to XTable
Release Lock XTable
End Transaction
Create XTable_temp
I see two issues with this solution. 1) I think that if I lock XTable, the lock would be released when the table is dropped and XTable_temp was being renamed. 2) I can't find a command to rename a table.
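On point 2, sp_rename is the command being looked for. A hedged sketch of the swap (the schema locks taken by DROP and sp_rename inside the transaction should hold readers off until the commit):

BEGIN TRANSACTION
    DROP TABLE XTable
    EXEC sp_rename 'XTable_temp', 'XTable'
COMMIT TRANSACTION
-- then recreate XTable_temp for the next load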
Is it possible to take a text file that contains multiple record types through the Data Transformation Service in MS SQL 7.0 and load each different record type into a separate table?
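One hedged workaround, assuming the record type is identifiable from the line itself (the staging table names and the two-character prefix here are hypothetical): land the raw lines in a one-column staging table, then fan them out by type.

CREATE TABLE #RawLines (Line varchar(8000))

-- Assumes no tab characters in the file, so each line lands whole in Line.
BULK INSERT #RawLines FROM 'C:\data\mixed.txt'

INSERT INTO HeaderStage (RawLine)
SELECT Line FROM #RawLines WHERE LEFT(Line, 2) = '01'

INSERT INTO DetailStage (RawLine)
SELECT Line FROM #RawLines WHERE LEFT(Line, 2) = '02'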
I'm using the For Each loop container to load multiple XML data files into SQL Server, and noticing some peculiar behavior and need some advice.
The pattern I'm trying to accomplish is this: Iterate over a collection of XML files in a specific folder, loading each in turn into SQL Server. If the file has already been loaded, delete the records first before the load. After the load succeeds, move the file into an Archive folder.
To accomplish this, I've set up a For Each Loop container using the For Each File enumerator, and retrieve just the file name and extension into a variable. The first task in this is an Execute SQL task that uses a SQL DML statement to delete records based on a field in the table containing the file name (DELETE FROM table WHERE PROG_NAME = ?), and maps a variable to the parameter. The next task is the data flow task that uses an XML source using the variable as the file name, and a SQL Server destination. I use a derived column task in between to plug the variable holding the file name into the PROG_NAME field. So far, so good. This works.
But now comes the peculiar part. I initially had the XSD files in the same folder as the XML files, but wanted to put them in their own directory, so moved them, and made the change to the XML source adapter for the new path to the XSD file. The next time I ran my package, it failed. For some reason, as the For Each Loop tried to iterate over the directory, it was using the XSD path assigned in the XML source instead of the path for the XML files. Unusual...
My question is, why when choosing the File name & Extension retrieval type (as opposed to the fully qualified name) will the task try to use the XSD location to find the files? Is my variable getting reassigned somewhere?
We are trying to load flat text files with upwards of 7 million records into a table on SQL. The table has a clustered index on 3 fields. We are sometimes able to complete smaller tables (500,000-750,000 records) and build the indexes prior to importing the data, however when we try the larger tables an error occurs :
Error at Destination for row number 6785496. Errors encountered so far in this task: 1
Location: somerge.c:1573 Expression: mrP->mrStatus!=MERGERUN::NONE SPID: 11 Process ID: 173
None of the records end up importing. The row number it gives is always the total number of records that was in the text file I was trying to import. I tried to import the text files first and then build the clustered indexes, but a table with only 300,000 records ran for nearly 4 days without completing before we killed it.
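One hedged thing to try, assuming the file can be pre-sorted on the three clustered-key fields (names here are hypothetical): telling BULK INSERT the data is already ordered lets the engine skip the sort that tends to be the killer on multi-million-row loads into a clustered index.

BULK INSERT dbo.BigTable FROM 'C:\data\bigfile.txt'
WITH
(
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n',
    ORDER (KeyField1, KeyField2, KeyField3),   -- must match the clustered index
    TABLOCK
)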
I was wondering if it is possible to assign values to multiple variables from within the same Execute SQL task, i.e. I want to use only one Execute SQL task, have multiple T-SQL statements within it, and then assign the results of these SQL statements as values to multiple variables.
Typically I would declare variables var1, var2 and var3; can I just add one Execute SQL task and have multiple SQL statements within it? Something like this:
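(A hedged sketch of the T-SQL side - TableA/TableB/TableC are hypothetical stand-ins: return one row whose columns are then mapped to var1, var2 and var3 on the task's Result Set page, with ResultSet set to 'Single row'.)

SELECT
    (SELECT COUNT(*)      FROM TableA) AS var1,
    (SELECT MAX(LoadDate) FROM TableB) AS var2,
    (SELECT MIN(ID)       FROM TableC) AS var3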
I have got an XML file with a size of more than 2 GB. I have to load this file into tables. On a 32-bit platform, I am unable to load this file using SSIS. RAM is 8 GB, but it is still bombing out. As I understand it, SSIS uses the XML DOM parser and tries to shred the file in memory, and because of the memory limitation, it fails. Although I have already written code in C# using the XmlTextReader object (a SAX-style parser) to load the data into tables, I want to keep this loading process within the purview of the DBAs.
I am stuck. Can someone guide me through the situation?
I'm having a very irritating time trying to migrate data from a COBOL system to SQL Server.
One of the A/R Master files has approx. 200 columns.
I can export this file any number of ways that will (sometimes) load partially into my database, but always when the load succeeds, columns 75 through N simply contain NULL, even though there is data in the file. When the load fails in DTS, the error is always missing column delimiter. Using BULK INSERT the error is always something like data too long at column 75.
Putting all this together, I have deduced that something isn't working if I try to load a staging table with more than 74 columns of data. This seems like a way-too-low threshold for a robust database, especially since SQL Server is supposed to be able to handle up to 1,024 columns per table.
I am having trouble loading tables (within the same data flow) that have a foreign key relationship defined between them. For instance:
Table A is a parent (one side of the relationship) to Table B (many side of the relationship).
I am trying to load Table A first within the data flow and then Table B after, but I get the following error:
[OCMD EntityRole Insert [2666]] Error: An OLE DB error has occurred. Error code: 0x80040E2F. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E2F Description: "The statement has been terminated.". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E2F Description: "The INSERT statement conflicted with the FOREIGN KEY constraint "FK_EntityRole_Entity". The conflict occurred in database "ODS", table "dbo.Entity", column 'EntityGuid'.".
I am currently using OLE DB Commands to perform the inserts. I load Table A and then move on to load Table B; I can see the records in Table A before trying to load Table B, but for some reason the Table B load still fails.
I was thinking maybe this has something to do with the transaction setting or isolation level, but I have played with these to no avail (currently everything is the default - supported/serializable). Also, I am thinking that maybe because the OLE DB commands are creating two separate connections (they are using the same connection manager), the second one is unable to see the transactions from the other (first) connection (Table A)?
Is there a way around this without dropping (disabling) foreign keys before the load and adding them back in after? I would like to avoid that.
I would also like to avoid reading the data source multiple times. Everything I need is in the one source so I would like to populate multiple tables from the one source data stream instead of reading the same data 2,3 or 4 times etc.
Seems to me there must be a simple explanation/solution for this, but I'm stuck at this point.
P.S.
I was initially using OLE DB destinations (because they are much faster) and was having the same issue, which made sense because the OLE DB destinations do not let you pass the data stream on, so I had to multicast to the destinations so they were loading at the same time. I would rather use the OLE DB destinations, so if you have any ideas on how I could do this using those components, that would be appreciated too!
What I am trying to do here is to load different flat files into different tables: for example, if the file name starts with "enrollment", then it goes to the "enrollment" table; if the file name starts with "student", then it goes to the "student" table.
For now, I created a Foreach Loop container for each of the different files, so it ended up using several Foreach Loop containers. I am wondering if there is a way to use just one Foreach Loop container to process the loading.
Can anyone shed some light on this? Thank you very much for your help!
I am attempting to execute a stored procedure as the SQL query for a data transformation from SQL into an Excel file. The stored procedure I am calling uses temp tables (#tempT1, #tempT2, etc.) to gather results from various calculations. When I try to execute this SP, I get: 'Error Source: Microsoft OLE DB Provider for SQL Server Error Description: Invalid Object name "#tempT1"'
Is there a way to make a DTS package call a stored procedure that uses temp tables?
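One workaround that often gets past this (hedged; dbo.MyTempTableProc is a stand-in for the real procedure): the provider probes the procedure's metadata with SET FMTONLY ON, and that metadata-only pass trips over the #temp tables, so forcing FMTONLY off in the source query can help.

SET FMTONLY OFF
EXEC dbo.MyTempTableProc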
I want to check to see if a temporary table exists before I try creating one but I can't seem to find which sys table or schema collection I check. Any ideas?
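For what it's worth, the usual test: temp tables live in tempdb, so the check goes against tempdb rather than the current database's sysobjects. A minimal sketch:

IF OBJECT_ID('tempdb..#MyTemp') IS NOT NULL
    DROP TABLE #MyTemp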
-- Add Structural data
usp_AddStructural @iEstimateID, 1, 'Structural'
usp_AddForming @iEstimateID, 2, 'Forming'
...
GO
Now, a couple of problems, after the table is created and populated, I cannot find it in my list of tables, even after "refreshing".
I checked to ensure that it exists using the query analyzer and it does so I know the table is being created.
Also, I cannot see the table using Crystal Reports when connecting, etc. Can I not access a temporary table from 3rd party applications? I have Crystal Reports 7.0 Professional.
I am in the process of modifying some stored procedures that currently do not use temp tables. For this modification I am required to make the stored procedures use temp tables. There are several UDFs within these stored procedures that will need to use the temp tables, and this is wherein the problem lies. Does anyone know of a workaround that would allow UDFs to use temp tables, or does anyone know of alternate methods instead of temp tables that wouldn't involve too much change?
I have a called stored procedure which creates a bunch of temporary tables, inserts data into them, indexes the tables and then returns to the main calling SP. In the main stored procedure I then do SELECTs from the temporary tables. My problem is I keep getting invalid object errors on the temporary tables: Invalid object name '#temp_table1'
The stored procedure is in a test environment. In the SELECT I tried a prefix of database owner (my logon) as well as "dbo." but still get the error. Any suggestions as to what I am doing wrong would be much appreciated.
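A hedged sketch of the usual fix: a #table created inside a child procedure is dropped the moment the child returns, so the caller can never see it. Turning the scoping around works, since a child can see and fill a table its caller created.

CREATE PROCEDURE dbo.ChildProc AS
    -- sees the caller's temp table
    INSERT INTO #temp_table1 (ID) SELECT 1
GO
CREATE PROCEDURE dbo.ParentProc AS
    CREATE TABLE #temp_table1 (ID int)
    EXEC dbo.ChildProc
    SELECT * FROM #temp_table1
GO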
In these two tables I'm just trying to bring back the data where the two DesignIDs don't match. I'm getting an error:
Server: Msg 107, Level 16, State 3, Line 1 The column prefix '#ttTopSellers' does not match with a table name or alias name used in the query.
Declare @CustomerID as VARCHAR(25)
Set @CustomerID = 'DELCOZ01-10'

/* Figure the designs that stores carry */
Select Design.Description, Item.DesignID, CustomerClassificationID, CustomerID,
       Region.[ID] as RegionID, Region.Name
Into #ttDesign
From Mas.dbo.Item Item
Inner Join MAS.dbo.Style Style on Item.StyleID = Style.[ID]
Inner Join MAS.dbo.Line Line on Style.LineID = Line.[ID]
Inner Join MAS.dbo.Design Design on Item.DesignID = Design.[ID]
Inner Join Mas.dbo.DesignRegionIndex DRI on Design.[ID] = DRI.DesignID
Inner Join MAS.dbo.Region Region on DRI.RegionID = Region.[ID]
Inner Join MAS.dbo.CustomerClassificationRegionIndex CRI on Region.[ID] = CRI.RegionID
Inner Join MAS.dbo.CustomerClassification CC on CRI.CustomerClassificationID = CC.[ID]
Where @CustomerID = CustomerID
Group By Design.Description, Item.DesignID, CustomerClassificationID, CustomerID, Region.[ID], Region.Name

/* This finds the top retail sales globally */
Select Top 10 Sum(Sales) as Sales, DesignID, Design.[Description]
Into #ttTopSellers
From Reporting.dbo.RetailSales_ByStore_ByCustomer_ByDay_ByItem DI
Inner Join Mas.dbo.Item Item on DI.ItemNumber = Item.ItemNumber
Inner Join MAS.dbo.Style Style on Item.StyleID = Style.[ID]
Inner Join MAS.dbo.Line Line on Style.LineID = Line.[ID]
Inner Join MAS.dbo.Design Design on Item.DesignID = Design.[ID]
Where [Date] >= Month(getdate())-12 and DesignID <> 0
Group By DesignID, Design.[Description]
Order by Sum(Sales) Desc
Select * From #ttDesign Where #ttDesign.DesignID <> #ttTopSellers.DesignID
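For what it's worth, a hedged sketch of a fix: the final SELECT names #ttTopSellers without ever joining it in, which is exactly what Msg 107 is complaining about. A LEFT JOIN with a NULL test returns the designs that have no match among the top sellers.

SELECT d.*
FROM #ttDesign d
LEFT JOIN #ttTopSellers t ON t.DesignID = d.DesignID
WHERE t.DesignID IS NULL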
Why can't I use the same temp table name in a stored procedure after I have dropped it? I use the Pubs database for the test case.

CREATE PROCEDURE spFulltUttrekk AS
SELECT * INTO #temp FROM Jobs
SELECT * FROM #temp
DROP TABLE #temp
SELECT * INTO #temp FROM Employee
SELECT * FROM #temp
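A hedged sketch of one workaround: the parser rejects two SELECT ... INTO #temp targets in a single procedure even though the first table is dropped at runtime, so giving the second result set its own name sidesteps the error.

CREATE PROCEDURE spFulltUttrekk AS
SELECT * INTO #temp1 FROM Jobs
SELECT * FROM #temp1
DROP TABLE #temp1
SELECT * INTO #temp2 FROM Employee
SELECT * FROM #temp2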