C: Partition Running Out Of Room, Large MDF, LDF Files
Sep 20, 2005
Hi, my data files sit in the default directories and I think they are causing my partition to run out of space. I mainly use one DB that I created and don't use the others (i.e. master, model, tempdb, etc.), yet I see their MDF and LDF files growing. What can I do to shrink them down, or perhaps move them off to a larger partition after shrinking?
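One common approach, sketched below, is to shrink the files first and then detach, copy and re-attach the user database on the larger partition. The database name MyDb, the logical file names MyDb_Data and MyDb_Log, and the target folder D:\SQLData are all placeholders; check sp_helpfile for the real logical names. Moving the system databases (master, model, tempdb) follows a different procedure described in Books Online.
USE MyDb
GO
DBCC SHRINKFILE (MyDb_Log, 100)      -- shrink the log file to roughly 100 MB
DBCC SHRINKFILE (MyDb_Data, 2000)    -- shrink the data file, leaving roughly 2 GB
GO
USE master
GO
EXEC sp_detach_db 'MyDb'
-- copy MyDb.mdf and MyDb_log.ldf to D:\SQLData, then:
EXEC sp_attach_db 'MyDb',
    'D:\SQLData\MyDb.mdf',
    'D:\SQLData\MyDb_log.ldf'
GO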
View 6 Replies
May 16, 2000
I am trying to reindex a large table, and cannot because there isn't enough room in the primary filegroup. The database consists of one physical file in the primary filegroup. The table is over 50% of the size of the database. When the table is less than 50% of the size of the database, I do not see this problem.
BTW, the only index on the table is the primary key which consists of two columns, one is an integer and the other datetime.
It seems as if SQL Server needs free space equal to the current size of the table in order to reindex. Is this the case?
It is not an option for me to allow the database to autogrow. Is there anything else I can do?
Thanks for your help!
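Rebuilding a clustered primary key in place does need free space in the same filegroup of roughly 1.2 times the table size, which matches what you are seeing. A hedged sketch of the two usual options is below; the table and index names are placeholders.
-- A full rebuild needs roughly 1.2x the table size free in the same filegroup:
DBCC DBREINDEX ('dbo.BigTable', 'PK_BigTable', 90)
GO
-- On SQL Server 2000 and later, defragmenting in place needs almost no free
-- space and can be used instead of a full rebuild:
DBCC INDEXDEFRAG (0, 'dbo.BigTable', 'PK_BigTable')
GO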
View 3 Replies
View Related
May 28, 2015
In my environment, a maintenance plan is configured on one of the servers, and while running DBCC CHECKDB on a database of around 200 GB, tempdb's log file usage increases and causes the maintenance job to fail.
What can I do to make the maintenance job run successfully? The tempdb database is only 50 GB and its recovery model is set to simple. It cannot be increased, because the mount point it resides on is 50 GB.
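Two DBCC options that are often used when tempdb is the constraint are sketched below; the database name is a placeholder. ESTIMATEONLY reports how much tempdb space the check would need, and PHYSICAL_ONLY skips the logical checks that drive most of the tempdb usage.
DBCC CHECKDB ('MyBigDb') WITH ESTIMATEONLY     -- how much tempdb would a full check need?
GO
DBCC CHECKDB ('MyBigDb') WITH PHYSICAL_ONLY    -- lighter check that uses far less tempdb
GO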
View 3 Replies
View Related
Dec 10, 2001
I am calling a SQL Server 6.5 Stored Procedure from Access 2000 with the following code :-
Public Function CheckDigitCalc()
Dim conn As New ADODB.Connection
Dim cmd As New ADODB.Command
On Error GoTo TryAgain
conn.Open "DSN=WEB;uid=sa;pwd=;DATABASE=WEB;"
Set cmd.ActiveConnection = conn
cmd.CommandText = "SPtest2"
cmd.CommandType = adCmdStoredProc
cmd.Execute
MsgBox "Numbers created OK.", vbOKOnly
Exit Function
TryAgain:
MsgBox "Error occurred, see details below :-" & vbCrLf & vbCrLf & _
Err & vbCrLf & vbCrLf & _
Error & vbCrLf & vbCrLf, 48, "Error"
End Function
The MsgBox pops up indicating that the stored procedure has run, and there are no errors produced by either SQL Server or Access. However, when I inspect the results of the stored procedure, it has not processed all the records it should have. It appears to stop processing after between 6 and 11 records out of a total of 50. The weird thing is that if I execute the procedure on the server manually, it works perfectly. Help me if you can! Thanks.
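A frequent cause of this symptom with classic ADO is the "n rows affected" message the procedure returns after every statement; ADO treats each one as a result and can stop consuming them part-way through. A hedged sketch of the usual fix is below; the body of SPtest2 is unknown here and only indicated by a comment. SQL Server 6.5 has no ALTER PROCEDURE, so the procedure has to be dropped and re-created.
DROP PROCEDURE SPtest2
GO
CREATE PROCEDURE SPtest2
AS
    SET NOCOUNT ON    -- suppress the "rows affected" messages that can make ADO stop early
    -- ... existing processing from the original procedure ...
GO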
View 2 Replies
View Related
Nov 27, 2007
Hi,
I had a problem combining OPENROWSET and SP_EXECUTESQL. When I tried to run the following query, it complained that @RESID is not declared. Any idea how I should write the query so I can pass @RESID as one of the parameters? BTW, I know that SP_EXECUTESQL is able to run a query up to 8000 characters long, but what about the parameter?
DECLARE @SqlToRun NVARCHAR(MAX)
DECLARE @RESOURCEIDS NVARCHAR(MAX)
SET @RESOURCEIDS = REPLICATE(CAST('ABC' AS NVARCHAR(MAX)), 1000)
SET @SqlToRun = N'SELECT * FROM
OPENROWSET(''SQLNCLI'',
''server=;database=;uid=;pwd='',
''SELECT * FROM WHERE COL=@RESID'')'
DECLARE @PARAMS NVARCHAR(MAX)
SET @PARAMS = N'@RESID NVARCHAR(MAX)'
EXECUTE SP_EXECUTESQL @SqlToRun, @PARAMS, @RESID = @RESOURCEIDS
Regards,
Derek
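sp_executesql binds @RESID only in the outer batch; the text inside OPENROWSET is sent verbatim to the remote server, which is why it reports the variable as undeclared. The usual workaround is to splice the value into the dynamic string as a literal, doubling up the quotes. A sketch is below; the server, database, login and table names are placeholders, and Ad Hoc Distributed Queries must be enabled for OPENROWSET to run. On SQL Server 2005, NVARCHAR(MAX) statements and parameters passed to sp_executesql are not limited to 8000 characters.
DECLARE @SqlToRun    NVARCHAR(MAX)
DECLARE @RESOURCEIDS NVARCHAR(MAX)
DECLARE @InnerQuery  NVARCHAR(MAX)
SET @RESOURCEIDS = N'ABC123'
-- Build the remote query first; quotes in the value are doubled to escape them
SET @InnerQuery = N'SELECT * FROM dbo.MyTable WHERE COL = '''
                + REPLACE(@RESOURCEIDS, N'''', N'''''') + N''''
-- Embed it in OPENROWSET, doubling every quote again because the whole inner
-- query becomes a string literal inside the dynamic batch
SET @SqlToRun = N'SELECT * FROM OPENROWSET(''SQLNCLI'', '
              + N'''server=MyServer;database=MyDb;uid=MyUser;pwd=MyPwd'', '
              + N'''' + REPLACE(@InnerQuery, N'''', N'''''') + N''')'
EXEC sp_executesql @SqlToRun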
View 5 Replies
View Related
Mar 20, 2008
The query shown below is designed to use seasonal profiles to compute 53 weeks of forecast data and then, from that, compute the number of weeks of supply of each item at each location. The query works, but the volume of data produced (20+ million rows) is substantial. If I limit the CTE to a single location, it runs in 2 seconds and returns 41,000 rows; but when run for all locations and items, it runs for more than 4 hours. Would I do better converting the CTE to a sub-query and adding an index to improve the performance of the main query?
WITH Forecast AS
(SELECT Location_Idx
,Item_Idx
,Week_Code
,(CAST(AnnualQty AS DECIMAL(9))/53.0)*[Profile] AS fcst
FROM dbo.FactReplenishmentProfile rp
INNER JOIN dbo.FactSeasonalProfile sp
ON sp.SeasonalProfile_Idx = rp.SeasonalProfile_Idx
)
SELECT fcst1.Location_Idx
,fcst1.Item_Idx
,fcst1.Week_Code
,fcst1.fcst AS WeekQty
,SUM(fcst2.fcst) AS CumQty
FROM Forecast fcst1
INNER JOIN Forecast fcst2
ON fcst2.Location_Idx = fcst1.Location_Idx
AND fcst2.Item_Idx = fcst1.Item_Idx
AND fcst2.Week_code <= fcst1.Week_Code
GROUP BY fcst1.Location_Idx,fcst1.Item_Idx,fcst1.Week_code,fcst1.fcst
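Materialising the forecast once is usually the first thing to try here, because the self-join makes SQL Server evaluate the CTE twice. A sketch is below; the temp table and index names are made up, and the column list simply mirrors the CTE.
SELECT Location_Idx,
       Item_Idx,
       Week_Code,
       (CAST(AnnualQty AS DECIMAL(9)) / 53.0) * [Profile] AS fcst
INTO   #Forecast
FROM   dbo.FactReplenishmentProfile rp
       INNER JOIN dbo.FactSeasonalProfile sp
               ON sp.SeasonalProfile_Idx = rp.SeasonalProfile_Idx
CREATE CLUSTERED INDEX IX_Forecast
    ON #Forecast (Location_Idx, Item_Idx, Week_Code)
SELECT fcst1.Location_Idx,
       fcst1.Item_Idx,
       fcst1.Week_Code,
       fcst1.fcst      AS WeekQty,
       SUM(fcst2.fcst) AS CumQty
FROM   #Forecast fcst1
       INNER JOIN #Forecast fcst2
               ON fcst2.Location_Idx = fcst1.Location_Idx
              AND fcst2.Item_Idx     = fcst1.Item_Idx
              AND fcst2.Week_Code   <= fcst1.Week_Code
GROUP BY fcst1.Location_Idx, fcst1.Item_Idx, fcst1.Week_Code, fcst1.fcst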
View 4 Replies
View Related
Oct 20, 2000
Hi,
I have inherited some databases with extremely large log files.
I tried to truncate the transaction log, but it did not work.
Can somebody please tell me how to truncate these log files?
Thanks in advance.
Attaullah
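A minimal sketch of the usual approach on SQL Server 7.0/2000 is below; MyDb and the logical log file name MyDb_Log are placeholders (check sp_helpfile). Note that truncating with TRUNCATE_ONLY breaks the log backup chain, so a full backup should be taken afterwards.
USE MyDb
GO
BACKUP LOG MyDb WITH TRUNCATE_ONLY     -- discard the inactive log records
GO
DBCC SHRINKFILE (MyDb_Log, 100)        -- then shrink the physical file to about 100 MB
GO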
View 2 Replies
View Related
Dec 5, 2000
I have some large flat files that I need to import into my SQL Server database. The file sizes range from 16 MB to 116 MB. I've tried saving the files to an Excel spreadsheet and then importing them in that format, but that didn't work. Does anyone have any suggestions?
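If the files are delimited text, BULK INSERT usually handles this size range without going through Excel. A minimal sketch, in which the table name, path and delimiters are assumptions:
BULK INSERT dbo.StagingTable
FROM 'C:\Imports\bigfile.txt'
WITH (
    FIELDTERMINATOR = '\t',     -- tab-delimited source assumed
    ROWTERMINATOR   = '\n',
    TABLOCK                     -- allows minimal logging when the target table is empty
)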
View 1 Replies
View Related
Apr 4, 2007
I have a few tables using SQL Server 2005 Express. Currently they hold roughly 30-40k records. I have my log files set to restricted growth at 90 MB. While I'm not close to reaching that, I would like my tables to be able to scale up to possibly millions of records. Based on that, I figure the transaction log file will probably need a higher threshold (unrestricted growth). For those with experience: for tables that hold millions of records, what average log file sizes could I expect?
Is it a bad idea to just shrink the log file every night during off-peak hours so that, regardless of the number of records I have, I'll always start the day with a minimal log file?
Do large log files have any effect on SQL Server performance?
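One way to answer the sizing question empirically rather than guessing is to watch how full the log actually gets over a normal day and size it to that peak, since repeated shrink/grow cycles fragment the file. A quick check:
DBCC SQLPERF (LOGSPACE)    -- log size and percent used for every database on the instance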
View 3 Replies
View Related
Aug 16, 2006
We have SQL Server running on a Windows 2003 server, only because Backup Exec requires it. At the location C:\Program Files\Microsoft SQL Server\MSSQL\Data
there is this file: SuperVISorNet_log.LDF, which is 15 GB and is accessed daily. I apologize because I don't know what this is!
My question is: can this file be 'pruned' (for want of a better word)? It's taking up a lot of backup space.
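That .LDF is the database's transaction log, and the usual reason it reaches this size is a database in full recovery with no log backups ever being taken. A hedged sketch of one way to cap it, assuming the database is named SuperVISorNet, point-in-time recovery is not needed, and the logical log file name matches the physical one (check sp_helpfile):
ALTER DATABASE SuperVISorNet SET RECOVERY SIMPLE   -- stop the log growing without log backups
GO
USE SuperVISorNet
GO
DBCC SHRINKFILE (SuperVISorNet_log, 500)           -- release the space, leaving about 500 MB
GO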
View 17 Replies
View Related
Jan 25, 2008
I am trying to run a query that deletes duplicate records from a table with 24M rows. The problem is that each time I run it the log file fills up and I get an error saying the log file is full. For this reason the query never finishes.
Is there any way to turn off logging when running a query?
I think it also has to do with the disk drive running out of space, as the log file is growing to over 12 GB.
It is running in simple mode already.
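Logging cannot be turned off, but deleting in small batches lets simple recovery reuse the log between batches instead of growing it. A sketch is below; the table, key columns and the rule for which duplicate to keep are all placeholders.
SET ROWCOUNT 50000                       -- limit each DELETE to 50,000 rows (works on 2000 and 2005)
WHILE 1 = 1
BEGIN
    DELETE t
    FROM   dbo.BigTable t
    WHERE  EXISTS (SELECT 1
                   FROM   dbo.BigTable d
                   WHERE  d.KeyCol1 = t.KeyCol1
                     AND  d.KeyCol2 = t.KeyCol2
                     AND  d.ID      < t.ID)   -- keep the lowest ID in each duplicate set
    IF @@ROWCOUNT = 0 BREAK
    CHECKPOINT                           -- lets simple recovery truncate the inactive log
END
SET ROWCOUNT 0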
View 11 Replies
View Related
Jul 17, 2007
While running query below on SQL Server 2005 (Build 3790: Service Pack 2):
SELECT DISTINCT pkg.PrimaryBarcode
FROM dbo.Package AS pkg (NOLOCK)
JOIN dbo.PackageCycle AS pc (NOLOCK)
ON pkg.PackageKey = pc.PackageKey
WHERE (pkg.BillCycleDateKey >= 20061201) OR
(pkg.BillStatusKey = 1)
I received a partial result set followed by
An error occurred while executing batch. Error message is: Couldn't replace text
I suspect this is a memory issue, but cannot find any reference to this particular msg on the Microsoft forums or the other 3rd party forums. PrimaryBarcode is a varchar(50)
I am not sure where to go from here. I would appreciate any ideas. Thanks in advance.
Cheers,
Mike Byrd
View 2 Replies
View Related
Jun 11, 2008
Hello,
I have decided to use Linq for my current ASP.NET project and so far it has been good, but now I am implementing a system that will allow users to upload binary content such as pictures and videos. For ease of management and security, I have decided to store this content directly in the database. The performance hit is a minor concern because very few user-uploaded images/videos will be seen on any given page (usually just one).
From the limited tutorials I have seen on the internet, Linq supports the SQL Server varbinary column through its System.Linq.Binary class. This class does not appear to support STREAMS and instead opts to load all of the contents into memory. This content can then be converted to an array of bytes, which can then be output to the browser via the response stream. This is not good. What if I am sending a video that is very large? Varbinary supports up to 2 GB. I can't have a 2 GB video sitting in memory. It makes a lot more sense to stream it via a small buffer.
Obviously, I am going to limit the size of the content that users can upload, but the core problem remains. If I limit content size to 2 MB and I have 2 GB of memory on the server, then I can only serve 1000 users concurrently. In reality, that number would be much less because of other processes running on the server.
Is there no way to stream data from a varbinary column with Linq using a small buffer of bytes?
Do I need to implement some custom logic on my Linq classes? Since these classes are automatically generated, how would I do such a thing?
Thanks.
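Not with the generated Binary property itself, but the value can be paged out of the table in fixed-size slices so that neither the server call nor the client ever holds the whole thing in memory. A T-SQL sketch of the chunked read is below; the table, column and key names and the 64 KB chunk size are assumptions, and in practice the client would issue one such SELECT per chunk rather than running the loop server-side.
DECLARE @id INT, @offset BIGINT, @chunk INT, @total BIGINT
SET @id     = 1
SET @offset = 1
SET @chunk  = 65536
SELECT @total = DATALENGTH(Content)
FROM   dbo.MediaFile
WHERE  MediaFileId = @id
WHILE @offset <= @total
BEGIN
    SELECT SUBSTRING(Content, @offset, @chunk)   -- one 64 KB slice of the varbinary(max) value
    FROM   dbo.MediaFile
    WHERE  MediaFileId = @id
    SET @offset = @offset + @chunk
END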
View 1 Replies
View Related
Apr 9, 2008
How do I insert a large chunk of text into a table column? My project is to build a news website where people can go and read news articles. The articles are provided by the author in Word format, so how do I insert that news article into the table's column? Any help would be appreciated.
thanks
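A minimal sketch of the storage side, with made-up table and column names: an nvarchar(max) column (ntext on SQL Server 2000) holds the whole article, and the insert should be parameterised from the application so quoting and length are not a problem.
CREATE TABLE dbo.NewsArticle (
    ArticleId INT IDENTITY(1,1) PRIMARY KEY,
    Title     NVARCHAR(200) NOT NULL,
    Body      NVARCHAR(MAX) NOT NULL        -- full article text
)
INSERT INTO dbo.NewsArticle (Title, Body)
VALUES (N'Sample headline', N'...the entire article text passed as a parameter...')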
View 2 Replies
View Related
Feb 9, 2006
I have a table that I'm inserting a file into and using the Image data type to store the binary object. Now the code below works fine for files around 1.5 MB, but anything larger and it's like the code won't even execute and I get a Page Not found error.
I'm in the process of running some traces to find out what's going on in the backend, but I'm assuming there's something amiss with my code. The Image data type should handle files that size with no problem but for some reason it isn't.
Does anyone see anything wrong?
Thanks
Dim iLength As Integer = CType(File1.PostedFile.InputStream.Length, Integer)
If iLength = 0 Then Exit Sub 'not a valid file
Dim sContentType As String = File1.PostedFile.ContentType
Dim sFileName As String, i As Integer
Dim bytContent As Byte()
ReDim bytContent(iLength) 'byte array, set to file size
'strip the path off the filename
i = InStrRev(File1.PostedFile.FileName.Trim, "\")
If i = 0 Then
sFileName = File1.PostedFile.FileName.Trim
Else
sFileName = Right(File1.PostedFile.FileName.Trim, Len(File1.PostedFile.FileName.Trim) - i)
End If
conn = New SqlConnection(eco)
conn.Open()
cmd = New SqlCommand("INSERT INTO ECO_Attachments (ECOID, FromType, DocName,OldRev,NewRev,NtLogin,DisplayName, FileName, FileSize, FileData, ContentType) VALUES (@ECOID, @FromType,@DocName,@OldRev,@NewRev,@NtLogin,@DisplayName, @FileName, @FileSize, @FileData, @ContentType) ")
cmd.Connection = conn
Try
File1.PostedFile.InputStream.Read(bytContent, 0, iLength)
With cmd
.Parameters.Add("@ECOID", SqlDbType.Int)
.Parameters.Add("@FromType", SqlDbType.NVarChar, 50)
.Parameters.Add("@DocName", SqlDbType.NVarChar, 250)
.Parameters.Add("@OldRev", SqlDbType.NVarChar, 50)
.Parameters.Add("@NewRev", SqlDbType.NVarChar, 50)
.Parameters.Add("@NTLogin", SqlDbType.NVarChar, 100)
.Parameters.Add("@DisplayName", SqlDbType.NVarChar, 200)
.Parameters.Add("@FileName", SqlDbType.NVarChar, 255)
.Parameters.Add("@FileSize", SqlDbType.Real)
.Parameters.Add("@FileData", SqlDbType.Image)
.Parameters.Add("@ContentType", SqlDbType.NVarChar, 50)
.Parameters("@ECOID").Value = ECOID
.Parameters("@FromType").Value = From
.Parameters("@DocName").Value = DocName
.Parameters("@OldRev").Value = OldRev
.Parameters("@NewRev").Value = NewRev
.Parameters("@NTLogin").Value = NTLogon
.Parameters("@DisplayName").Value = DisplayName
.Parameters("@FileName").Value = sFileName
.Parameters("@FileSize").Value = iLength
.Parameters("@FileData").Value = bytContent
.Parameters("@ContentType").Value = sContentType
.ExecuteNonQuery()
'.ExecuteScalar()
End With
Catch ex As Exception
Response.Write(ex)
'Handle your database error here
Finally
conn.Close() 'close the connection whether or not the insert succeeded
End Try
View 1 Replies
View Related
Dec 7, 2000
Here's my dilemma: I have a file that's 308 bytes wide by 5.7 million records. The record length is fixed, and the position and width of each field within the record are known. When I run DTS I receive this error from the MS DTS flat file provider, with the description: "Error creating file mapping view: not enough storage is available to process this command." Then when I try to continue with the wizard, it will not allow me to separate the data into the format that I need. Is there any other way to import this file using DTS?
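One way around the DTS wizard for a fixed-width file of this size is BULK INSERT with a bcp format file that describes the 308-byte record layout. A sketch under assumed names and paths:
BULK INSERT dbo.ImportStaging
FROM 'D:\Feeds\bigfile.dat'
WITH (
    FORMATFILE = 'D:\Feeds\bigfile.fmt',   -- fixed-width column positions and lengths live here
    BATCHSIZE  = 100000,                   -- commit in batches to keep the log manageable
    TABLOCK
)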
View 1 Replies
View Related
Dec 19, 2007
Hi,
During my web search looking for a solution I ran across SQL CE 3.5 articles. My questions about SQL CE 3.5 are:
1) Can SQL CE 3.5 handle a 4-6 GB file
- Read
- Parse (SQL)
2) Can SQL CE 3.5 act as a standalone client that a user can view a large (4-6 GB) text file?
- Will I need a .NET (small) client to read the large (4-6 GB) text file?
More info:
The text file will reside on the machine where the SQL CE 3.5 is installed. There is no pull to get the data.
Thank you (in advance).
SQL CE 3.5
View 3 Replies
View Related
Oct 22, 2007
Hello,
I created a SSIS solution for reading data from dbase and storing them in SQL Server. In a ForEachDirectory-Loop up to one thousand dbase files are read and stored. The system where the packages are running has 16 GB RAM.
For the first few hundred dbase files everything goes fine, but then, the RAM seems not to suffice any more and a temp file is created (I changed the path in BufferTempStoragePath).
How can it be that there is a need to create temp files if there is so much RAM available?
Why is the RAM filled more and more during the SSIS package execution?
Is there anything I can do to release some of it? (it is running in a loop and there is no need to store all the data)
Could it be caused by dbase?? (I use Microsoft Jet 4.0 OLE DB Provider)
Another thing is that the temp file is not stored in the path I set in BufferTempStoragePath.
There are sufficient permissions set, but temp file is still created in user temp folder...
Any kind of help is very much appreciated!
Best Regards,
Stefan
View 5 Replies
View Related
Jul 28, 2014
I need to create a script that will import large XML files (500 MB to 7 GB) on a daily basis and store the data in a relational DB structure.
What is the best and fastest way of importing such files? I have played around with smaller files and found the following.
1. SSIS XML Data Source: It doesn't seem to like the complex elements types and throws out the file.
2. Using Bulk File Import, storing the file in an XML variable and using XQuery to parse the file: this works, but it can't take a file of more than 2 GB in size, so I can't use this method.
3. C# + XML Serialization: This also works, but seems to be terribly slow. I open the DB connection once, so it doesn't open and close for each db call, but still seems like it takes a long time.
How can I import large XML quickly into a relational table structure?
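For reference, a sketch of option 2 above (the path and element names are placeholders): it works for the smaller files, but the xml data type tops out at 2 GB, which is exactly the limit described, so the largest files still need to be split or streamed outside the engine (for example with SSIS or an XmlReader-based loader).
DECLARE @doc XML
SELECT @doc = CAST(BulkColumn AS XML)
FROM   OPENROWSET(BULK 'D:\Feeds\daily.xml', SINGLE_BLOB) AS src
SELECT n.value('(ItemId)[1]',   'INT')           AS ItemId,
       n.value('(ItemName)[1]', 'NVARCHAR(100)') AS ItemName
FROM   @doc.nodes('/Root/Item') AS t(n)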
View 9 Replies
View Related
Aug 13, 2007
What is the easiest way to get the definition of a large fixed-width text file (200 columns) into SSIS? Having to define each column with the ruler would be very cumbersome.
View 5 Replies
View Related
Jul 16, 2007
Hi ,
Is there any method by which I can divide a large flat file into a certain number of smaller files, keeping the header in each of the sub-files?
Regards,
Prash
View 4 Replies
View Related
Aug 29, 2007
I have several databases that have grown to 300 GB and would like to distribute the data into multiple files across multiple drives. Can I create a new database that is spread across the new drives and use a full backup to restore or am I stuck with unloading the data table by table?
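A restore keeps the file layout that was in place when the backup was taken, so restoring a full backup into a new multi-file database is not an option by itself. The usual route is to add files on the new drives to the existing database and then rebalance, sketched below with placeholder names and paths (check sp_helpfile for the real logical file names).
ALTER DATABASE BigDb
ADD FILE (NAME = BigDb_Data2, FILENAME = 'E:\SQLData\BigDb_Data2.ndf', SIZE = 100GB)
ALTER DATABASE BigDb
ADD FILE (NAME = BigDb_Data3, FILENAME = 'F:\SQLData\BigDb_Data3.ndf', SIZE = 100GB)
-- Moves most of the pages out of the original file into the other files in the
-- same filegroup (the primary file itself cannot be removed or fully emptied):
DBCC SHRINKFILE (BigDb_Data1, EMPTYFILE)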
View 3 Replies
View Related
Sep 11, 2007
Hello,
I am attempting to restore the database from within VB.NET application I am making the following 3 calls:
RESTORE FILELISTONLY FROM DISK = 'C:\MyDatabase.dat'
USE Master RESTORE DATABASE MyDatabase FROM DISK = 'C:\MyDatabase.dat' WITH NORECOVERY, MOVE 'MyDatabase' TO 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\MyDatabase.mdf', MOVE 'MyDatabase_log' TO 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\LDF\MyDatabase.ldf', REPLACE
RESTORE DATABASE MyDatabase FROM DISK = 'C:\MyDatabase.dat'
using SMO. This logic works fine with small *.dat files; however, when using a *.dat file of about 4 GB I get an error on the 3rd RESTORE DATABASE call:
ExecuteNonQuery failed for Database 'master'.
An exception occurred while executing a Transact-SQL statement or batch.
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
Operator aborted backup or restore. See the error messages returned to the console for more details.
ExecuteNonQuery failed for Database 'master'.
An exception occurred while executing a Transact-SQL statement or batch.
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
Operator aborted backup or restore. See the error messages returned to the console for more details.
The same program/logic also works fine when I use MS SQL 2005, and it runs fine from the MS SQL 2005 Query Analyzer for both 2005 and 2000 databases. There seems to be a problem only with MS SQL 2000 from within VB.NET. Does anybody have any idea? I'd appreciate any response. Thanks
Eugene
View 6 Replies
View Related
Nov 23, 2005
My problem is trying to calculate whether a room is booked during a date period. I have a table with two columns (Start and End date). I need some SQL code to calculate whether a room is booked during a date range. For example, the booking entry is: Start 21/11/2005, End 25/11/2005. Any help on this would be appreciated. The End date indicates the last night of the stay. Regards, Steven
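A sketch of the standard interval-overlap test, using the convention above that End is the last occupied night; the table, column and variable names are assumptions. Any row returned means the room is already taken for part of the requested stay.
DECLARE @FirstNight DATETIME, @LastNight DATETIME
SET @FirstNight = '20051122'     -- first night of the requested stay
SET @LastNight  = '20051124'     -- last night of the requested stay
SELECT b.RoomId
FROM   dbo.Booking b
WHERE  b.StartDate <= @LastNight      -- existing stay begins on or before the last requested night
  AND  b.EndDate   >= @FirstNight     -- and its last night is on or after the first requested night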
View 5 Replies
View Related
Jun 24, 2014
I have a master table containing details of over 800000 surveys made up of approximately 400 distinct document names and versions. Each document can have as few as 10 questions but as many as 150. Each question represents one row.
My challenge is to create a separate spreadsheet for each of the 400 distinct document names and versions containing all the rows and columns present in the master table. The largest number of rows would be around 150 and therefore each spreadsheet will not be very big.
e.g. in my sample data below, I will need to create individual Excel files named as follows...
"Document1Version1.xlsx" containing all the column names and 6 rows for the 6 questions relating to Document 1 version 1
"Document1Version2.xlsx" containing all the column names and 8 rows for the 8 questions relating to Document 1 version 2
"Document2Version1.xlsx" containing all the column names and 4 rows for the 4 questions relating to Document 2 version 1
I assume that one of the first things is to create a lookup of the distinct document names and versions assign some variables and then use this lookup to loop through and sequentially filter the master table data ready for creating the individual Excel files.
--CREATE TEMP TABLE FOR EXAMPLE
IF OBJECT_ID('tempdb..#excelTest') IS NOT NULL DROP TABLE #excelTest
CREATE TABLE #excelTest (
[rowID] [nvarchar](10) NULL,
[docName] [nvarchar](50) NULL,
[Code] .....
--Output
rowID  docName    docVersion  question  blankField
1      document1  1           q1        NULL
2      document1  1           q2        NULL
3      document1  1           q3        NULL
4      document1  1           q4        NULL
5      document1  1           q5        NULL
6      document1  1           q6        NULL
[Code] .....
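A sketch of the lookup-and-loop scaffolding described above, using the temp table from the example; the docVersion data type and the file-name pattern are assumptions, and the actual writing of each .xlsx would be handed to SSIS, bcp or a client program.
DECLARE @docName NVARCHAR(50), @docVersion NVARCHAR(10), @fileName NVARCHAR(200)
DECLARE doc_cursor CURSOR FAST_FORWARD FOR
    SELECT DISTINCT docName, docVersion FROM #excelTest
OPEN doc_cursor
FETCH NEXT FROM doc_cursor INTO @docName, @docVersion
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @fileName = @docName + N'Version' + @docVersion + N'.xlsx'
    -- The filtered rows for this document/version; pass this result set and
    -- @fileName to whatever component actually creates the workbook
    SELECT *
    FROM   #excelTest
    WHERE  docName = @docName
      AND  docVersion = @docVersion
    FETCH NEXT FROM doc_cursor INTO @docName, @docVersion
END
CLOSE doc_cursor
DEALLOCATE doc_cursor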
View 9 Replies
View Related
Dec 8, 2006
Hello everyone,
This is my first time posting here; I hope this question has not been asked before. I tried to search for it but came up with nothing.
Recreating the error :
I am using VS2005. I created a Pocket PC 2003 project.
I have downloaded the SQL Server Compact Edition and installed it. I get the System.Data.SqlServerCe.dll file from the installation directory.
I reference that DLL using Add Reference in VS2005.
Build it. In the Bin folder, a long list of files suddenly appears.
System.data.dll
System.data.oracleClient.dll
system.web.dll
system.enterpriseservices.dll
system.enterpriseservices.wrapper.dll
system.transactions.dll
and the rest of your original files in Bin
The worst of it all: all of these files are deployed to the emulator, causing it to run out of memory and fail to deploy.
Something is not right here; I just cannot figure it out! If this happens, each mobile device can hold only one application. That's not the way it should be, right?
If you have solved this before, do help. I am at my wits end at the moment.
Thanking you in advance.
Sincerely,
Lasker
View 1 Replies
View Related
Aug 22, 2005
i need a script that:
selects all records from a room; if that room has fewer than 100 records, continue with the next room until
we have 100 records. The next time I run this script, it must give me other rooms (random).
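A sketch under assumed names (a Room table and a Records table keyed by RoomId): rooms are visited in a fresh random order on every run and collected until at least 100 records have been gathered.
CREATE TABLE #picked (RoomId INT, RecordId INT)
DECLARE @room INT
DECLARE room_cursor CURSOR FOR
    SELECT RoomId FROM dbo.Room ORDER BY NEWID()   -- random room order, different each run
OPEN room_cursor
FETCH NEXT FROM room_cursor INTO @room
WHILE @@FETCH_STATUS = 0 AND (SELECT COUNT(*) FROM #picked) < 100
BEGIN
    INSERT INTO #picked (RoomId, RecordId)
    SELECT RoomId, RecordId
    FROM   dbo.Records
    WHERE  RoomId = @room
    FETCH NEXT FROM room_cursor INTO @room
END
CLOSE room_cursor
DEALLOCATE room_cursor
SELECT * FROM #picked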
View 9 Replies
View Related
Jul 4, 2006
Hi,
We are processing 6,000,000 rows (a 2 GB file) from a flat file and loading them into database tables using OLE DB Destination components. In the data pipeline of an SSIS package we have 1 flat file source reader, 7 lookup components (full cache mode), 1 multicast component and 2 OLE DB destinations with the fast load option.
We have observed that the first 1,000,000 rows are processed and loaded into the target tables in just 4 minutes. The second set of 1,000,000 rows is processed in 15 minutes. After this, SSIS takes approximately 8-10 minutes to process each 100,000 rows. We are not able to identify the reason for this unexpected behaviour of SSIS.
We thought that, as the input file size is 2 GB, SSIS was not able to manage it and was slowing down over the course of execution. We split the big input file into 60 small files of about 37 MB each. Then we modified the package by adding a For-Each Loop task to process all 60 small files and load them into the database server sequentially. Even with this approach, data loading slowed down drastically after processing 13 files.
To verify whether there was any problem with reading the source file or with the transformations, we replaced the OLE DB destination components with Flat File destinations. With Flat File destinations the time taken to process rows is very constant: every 8 minutes the package processes 1,000,000 rows and writes them to the destination files. So there is no problem with either the lookup components or the flat file source reader.
We are sure that the target database server is in the same state/condition from the start to the end of package execution. The client box on which we are running the package has 1 GB of RAM. During package execution the CPU usage is at 30% and PF usage is 580 MB. SP1 is also installed on both client and server.
Does anyone have a clue what is causing the data load to slow down over the course of package execution?
View 3 Replies
View Related
Jul 26, 2006
I'm currently experiencing major problems with SSIS when opening and editing large .DTSX package files that contain Exec DTS 2000 Tasks with the package data loaded internally. I have no issues if I point the task to a .DTS file, or to an actual DTS package on a SQL 2000 server. But if I load the package internally, then once the underlying .DTSX file gets over around 17 MB or so in size (which doesn't take long when making a few edits to even fairly simple packages now and then), I start to experience major issues with VS/BIDS 2005 crashing randomly when I try to perform any action on the package (open, save etc.): things like OutOfMemory exception errors, followed by the properties of the Exec DTS 2000 task being deleted, sometimes accompanied by messages about the application not being installed properly.
Again, it is ONLY when the underlying .DTSX file reaches a certain size limit, and only when I've got an Exec DTS 2000 task with the package loaded internally. I've replicated the issue using several different package files on several different machines (even on servers with lots of memory, fwiw).
Can anyone out there help me with this? SSIS - namely SSIS Exec DTS 2000 package tasks - is our lifeblood at my company, and this trend of random and serious crashing on large package files is very disturbing, to say the least.
thanks,
Wil
View 1 Replies
View Related
Jan 11, 2008
Hi,
I have stumbled on a problem with running a large number of SSIS packages in parallel, using the "dtexec" command from inside a SQL Server job.
I've described the environment, the goal and the problem below. Sorry if it's a bit too long, but I tried to be as clear as possible.
The environment:
Windows Server 2003 Enterprise x64 Edition, SQL Server 2005 32-bit Enterprise Edition SP2.
The goal:
We have a large number of text files that we're loading into a staging area of a data warehouse (based on SQL Server 2005, as said above).
We have one "main" SSIS package that takes a list of files to load from an XML file, loops through that list and for each file in the list starts an SSIS package by using the "dtexec" command. The command is started asynchronously by using the system.diagnostics.process.start() method. This means that a large number of SSIS packages are started in parallel. These packages perform the actual loading (with BULK INSERT).
I have successfully run the loading process from the command prompt (using the dtexec command to start the main package) a number of times.
In order to move the loading to a production environment and schedule it, we have set up a SQL Server Agent job. We've created a proxy user with the necessary rights (the same user that runs the job from the command prompt) and created the SQL Agent job (there is one step of type "cmdexec" that runs the "main" SSIS package with the "dtexec" command).
If the input XML file for the main package contains a small number of files (for example 10), the SQL Server Agent job works fine: the SSIS packages are started in parallel and they finish their work successfully.
The problem:
When the number of concurrently started SSIS packages gets too big, the packages start to fail. When a large number of SSIS package executions are already taking place, the new dtexec commands fail after 0 seconds of work with an empty error message.
Please bear in mind that the same loading still works perfectly from the command prompt on the same server with the same user. It only fails when run from the SQL Agent job.
I've tried to understand the limit at which the packages start to fail, and I believe that the threshold is 80 parallel executions (I understand that it might not be desirable to start so many SSIS packages at once, but I'd like to do it despite this).
Additional information:
The dtexec utility provides an error message where the package variables are shown and the fact that the package ran 0 seconds, but the "Message" is empty ("Message: ").
Turning logging on in all the packages does not provide an error message either, just a lot of run-time information.
The try-catch block around the process.start() script in the main package's script task also does not reveal any errors.
I've increased the "max worker threads" number for the cmdexec subsystem in the msdb.dbo.syssubsystems table to a safely high number and restarted the SQL Server, but this had no effect either.
The request:
Can anyone give ideas what could be the cause of the problem?
If you have any ideas about how to further debug the problem, they are also very welcome.
Thanks in advance!
Eero Ringmäe
View 2 Replies
View Related
Jan 24, 2008
Hi,
I am developing a .Net application for a Guest house for my final year project. I'm using C# and SQL Server. I want a way to look up if a room is available or not on a particular date.
First I came up with...
Room: Room_id(PK), Room_type, Room_rate_pp, Room_availability
but how can I distinguish whether a room is available on a particular date? I think Room_availability is useless in this table because it can't do this. Do I need some sort of calendar table or something?
Your help with this would be greatly appreciated. It's annoying me so much, and I can't move on with the project until I have my database finalised :eek:
Thanks,
Sean
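A sketch under assumed names: availability is derived from a Booking table rather than stored as a flag on Room, so any date range can be checked with a NOT EXISTS overlap test (here CheckOut is treated as the morning the guest leaves, i.e. exclusive).
CREATE TABLE dbo.Room (
    Room_id      INT           NOT NULL PRIMARY KEY,
    Room_type    NVARCHAR(50)  NOT NULL,
    Room_rate_pp DECIMAL(8, 2) NOT NULL
)
CREATE TABLE dbo.Booking (
    Booking_id INT IDENTITY(1,1) PRIMARY KEY,
    Room_id    INT      NOT NULL REFERENCES dbo.Room (Room_id),
    CheckIn    DATETIME NOT NULL,
    CheckOut   DATETIME NOT NULL       -- exclusive: the guest leaves this morning
)
-- Rooms free for the whole of a requested stay:
DECLARE @From DATETIME, @To DATETIME
SET @From = '20080201'
SET @To   = '20080205'
SELECT r.Room_id, r.Room_type, r.Room_rate_pp
FROM   dbo.Room r
WHERE  NOT EXISTS (SELECT 1
                   FROM   dbo.Booking b
                   WHERE  b.Room_id  = r.Room_id
                     AND  b.CheckIn  < @To
                     AND  b.CheckOut > @From)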
View 1 Replies
View Related
Feb 7, 2006
I downloaded the 800 MB file (32-bit) on my XP (SP2) box and it comes up with a message that I do not have enough disk space on the C: drive. Well, I have 180 GB of space, so that can't be it. Is it because it requires an NTFS drive and not a FAT32 drive?
View 1 Replies
View Related
Oct 26, 2007
I have 20 .sql files that were written to create new views in a database. We are moving over to a new database and need to run all of these scripts against the new database to add the views (each time we refresh for testing). Is there a way to write a procedure that calls all of these .sql files so they do not have to be opened and run individually? I am thinking something like
EXEC xp_sqlmaint '
or
EXEC Master..xp_cmdshell
but I cannot figure out the right syntax, or else I'm going in the wrong direction. What do you suggest?
Thanks,
Swoozie
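The xp_cmdshell idea works; a sketch is below, in which the folder, server and database names are placeholders and xp_cmdshell must be enabled on the instance. The file list is read with dir /b and each script is run through sqlcmd (substitute osql on SQL Server 2000).
CREATE TABLE #scripts (fileName NVARCHAR(260))
INSERT INTO #scripts (fileName)
EXEC master..xp_cmdshell 'dir /b "D:\ViewScripts\*.sql"'
DECLARE @file NVARCHAR(260), @cmd NVARCHAR(1000)
DECLARE script_cursor CURSOR FOR
    SELECT fileName FROM #scripts WHERE fileName LIKE '%.sql'
OPEN script_cursor
FETCH NEXT FROM script_cursor INTO @file
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = N'sqlcmd -S MyServer -d NewDatabase -E -i "D:\ViewScripts\' + @file + N'"'
    EXEC master..xp_cmdshell @cmd
    FETCH NEXT FROM script_cursor INTO @file
END
CLOSE script_cursor
DEALLOCATE script_cursor
DROP TABLE #scripts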
View 5 Replies
View Related