Read File Size In Sql Query
May 23, 2008
Hi,
How can I read a file's size in a SQL query?
I have a web-based admin section for a site and I would like to be able to query the SQL Server database size and display it within my admin area. Is there a function or method of doing this? My database is hosted by a third party... I appreciate any tips or advice you can provide!
Rob
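A minimal sketch of one way to do this from your admin page, assuming you can run queries against your own hosted database (no server-level rights needed); sp_spaceused reports the overall size, and on SQL Server 2005 and later sys.database_files gives per-file sizes (size is stored in 8 KB pages):

-- Overall database size and unallocated space
EXEC sp_spaceused;

-- Per-file sizes in MB
SELECT name, type_desc, size * 8 / 1024 AS size_mb
FROM sys.database_files;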
I am trying to resize a database's initial log file from 500 MB to 2 MB. I'm using:
ALTER DATABASE <DBNAME> MODIFY FILE ( NAME = <DBLOGFILENAME>, SIZE = 2 )
And I'm getting "MODIFY FILE failed. Specified size is less than current size." I tried going into the database properties and setting the log file to 2 MB, but it doesn't keep the changes.
Any help with this process?
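MODIFY FILE can only grow a file; to get below the current size the file generally has to be shrunk first. A minimal sketch, keeping the placeholder names from the statement above and assuming the log has been backed up (or the database is in SIMPLE recovery) so the space can actually be released:

USE <DBNAME>;
GO
-- Shrink the physical log file down to 2 MB
DBCC SHRINKFILE (<DBLOGFILENAME>, 2);
GO
-- Optionally pin the new size afterwards
ALTER DATABASE <DBNAME> MODIFY FILE ( NAME = <DBLOGFILENAME>, SIZE = 2MB );
GO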
OBJECTIVE: I would like to read a text file from SQL Server 2000, read the text file's contents, and load them into a RichTextBox.

THINGS I'VE DONE AND HAVE WORKING:
1) I've successfully loaded a text file (e.g. textFile.txt) into a SQL Server database table column (with datatype Image).
2) I'm also able to serve the file using a handler as below:

using System;
using System.Web;
using System.Data.SqlClient;

public class HandlerImage : IHttpHandler
{
    string connectionString;

    public void ProcessRequest(HttpContext context)
    {
        connectionString = System.Configuration.ConfigurationManager.ConnectionStrings["NWS_ScheduleSQL2000"].ConnectionString;
        int ImageID = Convert.ToInt32(context.Request.QueryString["id"]);
        SqlConnection myConnection = new SqlConnection(connectionString);
        string Command = "SELECT [Image], Image_Type FROM Images WHERE Image_Id=@Image_Id";
        SqlCommand cmd = new SqlCommand(Command, myConnection);
        cmd.Parameters.Add("@Image_Id", System.Data.SqlDbType.Int).Value = ImageID;
        SqlDataReader dr;
        myConnection.Open();
        cmd.Prepare();
        dr = cmd.ExecuteReader();
        if (dr.Read())
        {
            // Write the stored file to the browser
            context.Response.ContentType = dr["Image_Type"].ToString();
            context.Response.BinaryWrite((byte[])dr["Image"]);
        }
        myConnection.Close();
    }

    public bool IsReusable
    {
        get { return false; }
    }
}
<a href='<%# "HandlerDocument.ashx?id=" + Eval("Doc_ID") %>'>File
</a>

Clicking on this link lets me download or view the file.

WHAT I WANT TO DO, BUT HAVE A PROBLEM WITH: I would like to read the CONTENT of this file and load it into a string, as below:

StreamReader SR = new StreamReader("File.txt");
string contentText = SR.ReadLine();
txtBox.Text = contentText;

BUT THIS ONLY WORKS for files on the server. I would like to be able to read the FILE CONTENTS from SQL Server. PLEASE HELP. I really appreciate it.
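One hedged option on SQL Server 2000 is to pull the stored bytes back out of the Image column and cast them to character data on the SQL side, then read that string from the data reader instead of opening a file on disk. SUBSTRING on an image column returns at most 8000 bytes per call, so larger files would have to be read in chunks; the table and column names below match the handler above:

-- Return the first 8000 bytes of the stored file as text (SQL Server 2000)
SELECT CAST(SUBSTRING([Image], 1, 8000) AS VARCHAR(8000)) AS FileText
FROM Images
WHERE Image_Id = @Image_Id;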
I have a log file that is approximately 50 GB. I backed up just the log, and the file size of the .bak is 192 GB. Why is this? Shouldn't it be closer to the 50 GB?
Normally I wouldn't let the log grow this much. But we are in the process of getting a new server up and running and don't have backups going yet. They are working on getting that up and running this week.
So I did a log backup to give me back some log space for now but was concerned when I saw the size of the .bak file.
When I view the media contents of the backup device, it shows one transaction log backup with a size of 192 GB.
What is up with this? I know in SQL 2000 the log backup files were never this big; they were about the size of the log itself.
Any ideas?
Stacy
I installed SQL 2005 a while back. Then I recently found out my file system was FAT32 (I don't understand why the hardware people did this...) and I had to convert to NTFS. Naturally the SQL service no longer worked, so I uninstalled in order to reinstall. Now I can't reinstall it; I keep getting this message:
native_error=5039, msg=[Microsoft][SQL Native Client][SQL Server]MODIFY FILE failed. Specified size is less than current size.
I'll try to post the full log in a new post.
I have one database, test, with one .mdf and one .ldf file. The .mdf file size is 100 MB. For some reason I removed all the tables from that .mdf file and transferred them into a new secondary file, so all the tables are now in the secondary file. Now I want to reduce the first .mdf file from 100 MB to 50 MB. Is that possible? It's showing 90 MB free. Please reply.
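A minimal sketch of shrinking the now mostly empty primary data file, assuming the logical file name is test_data (check sysfiles / sys.database_files for the real name); the target size is given in MB:

USE test;
GO
-- Shrink the primary data file to roughly 50 MB
DBCC SHRINKFILE (test_data, 50);
GO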
I receive Error: 3967, Severity: 17, State: 1, "Insufficient space in tempdb to hold row versions." We have 8 data files for tempdb of 10210 GB size, with 10240 GB given as the max size.
As MS suggests, to calculate the tempdb file size and growth rate we need to monitor the performance counters Free Space in tempdb (KB) and Version Store Size (KB) in the Transactions object.
Basic formula: [Size of Version Store] = 2 * [Version store data generated per minute] * [Longest running time (minutes) of your transaction]
My disk utilization report says tempdb is full; I think I need to shrink the file.
I am still confused about calculating the size. My performance counters give me data as follows:
Free Space in tempdb (KB): 279938496
Version Generation rate (KB/s): 53681040
Version Cleanup rate (KB/s): 53422320
Version Store Size (KB): 258720
Version Store unit count: 22
Version Store unit creation: 774
Version Store unit truncation: 752
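Rather than reading Perfmon directly, the same counters can be pulled from sys.dm_os_performance_counters, which makes it easier to plug the numbers into the formula above. A small sketch (the Transactions object name varies with the instance name, hence the LIKE):

SELECT counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Transactions%'
  AND counter_name IN ('Free Space in tempdb (KB)',
                       'Version Store Size (KB)',
                       'Version Generation rate (KB/s)',
                       'Version Cleanup rate (KB/s)');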
Hello Experts,
I am creating one task (user control) in SSIS. I have a property grid in my GUI and 2 buttons (OK & Cancel).
The PropertyGrid has properties like SourceConnection, OutputConnection, etc. Right now I am able to populate connections in a list box next to the Source and Output properties.
Now my question to you guys is: depending on the Source Connection, the task should read the text file associated with that connection manager. After validation it should pick the header (the first line of the text file, based on record type) and write it into a new file when the task is executed. I have the following code for your reference. Please let me know whether I am going in the right direction or not.
What should go here?
->Under Class A
public override DTSExecResult Execute(Connections connections, VariableDispenser variableDispenser, IDTSComponentEvents componentEvents, IDTSLogging log, object transaction)
{
//Some code to read file and write it into new file
return DTSExecResult.Success;
}
public const string Property_Task = "CustomErrorControl";
public const string Property_SourceConnection = "SourceConnection";
public void LoadFromXML(XmlElement node, IDTSInfoEvents infoEvents)
{
if (node.Name != Property_Task)
{
throw new Exception(String.Format("Invalid task element '{0}' in LoadFromXML.", node.Name));
}
else
{
try
{
_sourceConnectionId = node.Attributes.GetNamedItem(Property_SourceConnection).Value;
}
catch (Exception ex)
{
infoEvents.FireError(0, "LoadFromXML", ex.Message, "", 0);
}
}
}
public void SaveToXML(XmlDocument doc, IDTSInfoEvents infoEvents)
{
try
{
// // Create Task Element
XmlElement taskElement = doc.CreateElement("", Property_Task, "");
doc.AppendChild(taskElement);
// // Save source FileConnection
XmlAttribute sourcefileAttribute = doc.CreateAttribute(Property_SourceConnection);
sourcefileAttribute.Value = _sourceConnectionId;
taskElement.Attributes.Append(sourcefileAttribute);
}
catch (Exception ex)
{
infoEvents.FireError(0, "SaveXML", ex.Message, "", 0);
}
}
In the UI class there is an OK click event.
private void btnOK_Click(object sender, EventArgs e)
{
try
{
_taskHost.Properties[CustomErrorControl.Property_SourceConnection].SetValue(_taskHost, propertyGrid1.Text);
btnOK.DialogResult = DialogResult.OK;
}
catch (Exception ex)
{
Console.WriteLine(ex);
}
#endregion
}
I have a filetable that contains a binary file. I need to do a selective read of the file stored in the filetable. I can write a C# CLR function that will open the file and read n bytes from a starting byte. Or I can write a SQL statement that reads the stream in the filetable into a VARBINARY variable using SUBSTRING, beginning at the starting byte (offset from 1), for the same n bytes.
Both give me the same result. However, the SQL statement takes considerably longer to read. I know there is overhead in reading through SQL (interpreted language), but the difference in performance is substantial, and I can only attribute this performance degradation to SQL first trying to "load" the entire stream before it identifies the portion of the stream it needs, beginning at the starting byte offset.
I wonder if this is the case or if there is another option to read a stream from a filetable directly through SQL queries that is more efficient.
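For reference, a minimal T-SQL sketch of the SUBSTRING approach described above, using a hypothetical filetable dbo.Documents and 1-based offsets; whether this avoids materializing the whole stream server-side is exactly the open question:

DECLARE @start BIGINT = 1048577,  -- 1-based offset into the stream
        @bytes INT    = 4096;     -- number of bytes to read

SELECT SUBSTRING(file_stream, @start, @bytes) AS chunk
FROM dbo.Documents
WHERE name = 'bigfile.bin';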
Hi All,
I need to read a CSV file, which is on a remote server, using the SQL BULK INSERT command.
Can I read a file that is on a remote server using BULK INSERT?
Thank you.......
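Generally yes: BULK INSERT accepts a UNC path, provided the account SQL Server runs under (or, with delegation, the caller) can reach the share. A hedged sketch with a hypothetical share and staging table:

BULK INSERT dbo.StagingCsv
FROM '\\RemoteServer\share\data.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);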
Here are my scenarios:
We have an application with a replicated environment set up on SQL Server 2012. Users will have a replica on their machines and they will replicate to the master database. It has 3 subscriptions subscribed to the publications on the master db.
1) We set up a replica (which uses SQL Server 2012) on a machine with no SQL Server on it. After the initial synchronization (using the replmerge tool), the mdf file has grown to 33 GB and the ldf to 41 GB. I went to SQL Server Management Studio, right-clicked, and checked the properties of the local database; the overall size is around 84 GB with little empty free space available.
2) We set up a replica (which uses SQL Server 2012) on a machine with SQL Server 2008 on it. After the initial synchronization (using the replmerge tool), the mdf file has grown to 49 GB and the ldf to 41 GB. I went to SQL Server Management Studio, right-clicked, and checked the properties of the local database; the overall size is around 90 GB with 16 GB free space available.
3) We set up a replica (which uses SQL Server 2012) on a machine with SQL Server 2012 on it. We dropped the local database, recreated it, and did the initial synchronization using the replmerge tool. The mdf file has grown to 49 GB and the ldf to 41 GB. I went to SQL Server Management Studio, right-clicked, and checked the properties of the local database; the overall size is around 90 GB with 16 GB free space available.
Why is it allocating the space differently? This is affecting our initial replica setup times.
I need to write a process to get the file size in KB and the record count of a file. I was planning on writing a C# console app that takes the file path and name as a parameter; however, should I use a CLR?
I can't put a script in the SSIS package when it's bringing the file down, because it has been deemed that we only use SSIS for file consumption.
What is the recommended size and file growth for a database and log file? We will be storing approx 10000 records a day. Currently we have the following:
CREATE DATABASE Dummy
ON PRIMARY
( NAME = Dummy_data,
  FILENAME = 'D:\...\DATA\Dummy.mdf',
  SIZE = 250MB,
  FILEGROWTH = 25MB )
LOG ON
( NAME = Dummy_log,
  FILENAME = 'D:\...\DATA\Dummy_log.ldf',
  SIZE = 50MB,
  FILEGROWTH = 5MB ) ;
GO
I have a database whose log file size is 4 times greater than the data file size, and it's continuously growing day by day. Recently we faced a limited-disk-space issue.
Is there any way to truncate the log file?
What is the impact on the db if I truncate the log file?
Is there any way to prevent this file from continuously growing?
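If point-in-time recovery is not required for this database, the usual options are either to take regular log backups (which let the existing log space be reused and stop the growth) or to switch to the SIMPLE recovery model and shrink the file once; note that truncating the log breaks the log backup chain. A hedged sketch of the second option, with placeholder database and logical file names:

ALTER DATABASE MyDb SET RECOVERY SIMPLE;
GO
USE MyDb;
GO
-- Shrink the log back to a reasonable size (target in MB)
DBCC SHRINKFILE (MyDb_log, 1024);
GO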
Is there a limitation on the size of files stored in the db by FILESTREAM in SQL Server 2008, or does it accept all sizes?
Is anyone reading XML disk files into SQL Server?
I have a process that I may want to do this with.
It would be a stored procedure that would read the XML attributes into 2 tables; the number of attributes could be 1 to N, so I thought XML would be a good choice. Also, one of the attributes could be up to 4000 characters. I think this may limit our options: can 100-150 4000-character strings be passed in a standard call to a query/proc in SQL?
Currently the client application makes round-trip network calls to save upwards of 100 pairs of data. 1 header row, and many detail rows. All within a transaction.
I think if we move an XML file to the SQL box and then do all the import/save work on the "server" side, it would be much better, cutting the transaction time down a lot by not doing so many round trips at network speed...
Thoughts?
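On SQL Server 2005 and later, one way to do the server-side import is to bulk-load the XML file into an XML variable and shred it with nodes(), so the whole header/detail set is saved in one round trip; the file path, element names, and target table below are hypothetical:

DECLARE @x XML;

-- Load the whole file from disk on the server
SELECT @x = CAST(BulkColumn AS XML)
FROM OPENROWSET(BULK 'C:\import\order.xml', SINGLE_BLOB) AS f;

-- Shred the detail rows into the target table
INSERT INTO dbo.OrderDetail (OrderId, Attr1, Attr2)
SELECT n.value('@OrderId', 'INT'),
       n.value('@Attr1', 'VARCHAR(100)'),
       n.value('@Attr2', 'VARCHAR(4000)')
FROM @x.nodes('/Order/Detail') AS t(n);

Wrapping the INSERTs in a single transaction on the server side keeps the behavior of the current client code while avoiding the per-row network calls.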
Hi,
I'm trying to write a script that checks my database file and log sizes (in MB) and inserts them into a table. I need the following columns:
dbid, dbname, compatibility_level, recovery_model, db_size_in_MB, log_size_in_MB.
I tried to write this and got stuck.
select sysdb.database_id,sysdb.name,sysdb.compatibility_level,
sysdb.recovery_model_desc,sysmaster.size from sys.databases sysdb,sys.master_files sysmaster
where sysdb.database_id = sysmaster.database_id
Can anyone help me with this script?
THX
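A minimal sketch of one way to finish the script, grouping sys.master_files by database and splitting data vs. log sizes (size is reported in 8 KB pages); the result can be inserted into a logging table as needed:

SELECT d.database_id,
       d.name,
       d.compatibility_level,
       d.recovery_model_desc,
       SUM(CASE WHEN f.type_desc = 'ROWS' THEN f.size END) * 8 / 1024 AS db_size_in_MB,
       SUM(CASE WHEN f.type_desc = 'LOG'  THEN f.size END) * 8 / 1024 AS log_size_in_MB
FROM sys.databases d
JOIN sys.master_files f ON f.database_id = d.database_id
GROUP BY d.database_id, d.name, d.compatibility_level, d.recovery_model_desc;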
We have 2 SQL Server 2005 servers running the same build, 9.0.2047. When I back up any database from one server and attempt to restore it to the other, the log file generally increases about 100-fold. It errors out after I try to restore a 100 MB db and it tries to create a 9.8 GB log file. This happens both when I use the GUI to restore and when I restore from a T-SQL script. What am I doing wrong?
Thanks in advance.
Can anyone help me with how to read a file (.txt) within a stored procedure?
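On SQL Server 2005 and later, a simple hedged option inside a stored procedure is OPENROWSET(BULK ... SINGLE_CLOB), which returns the whole file as one value; the path below is a placeholder and bulk-load permissions are required:

-- Return the file contents as a single varchar(max) value
SELECT BulkColumn AS FileContents
FROM OPENROWSET(BULK 'C:\temp\notes.txt', SINGLE_CLOB) AS f;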
Hi,
I have a requirement in which I need to read from a .ini file in a stored procedure in SQL Server 2000.
Is it possible? I tried searching on Google but I cannot find anything that can help.
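SQL Server 2000 has no built-in file API for this, but if xp_cmdshell is available one hedged workaround is to capture the file's lines into a temp table and parse the key=value pairs from there; the path is a placeholder:

CREATE TABLE #ini (line VARCHAR(8000));

-- Each output line of the TYPE command becomes one row
INSERT INTO #ini
EXEC master..xp_cmdshell 'type C:\config\app.ini';

SELECT line FROM #ini WHERE line LIKE '%=%';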
Can anyone tell me how to read an XML file using SQL syntax?
I have also posted my full requirement for why I need to read the XML file in the Transact-SQL forum list.
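On SQL Server 2000 the usual pattern is OPENXML after loading the document text; a hedged sketch, assuming the XML text is already in a variable (e.g. passed in from the client) and using made-up element and attribute names:

DECLARE @doc VARCHAR(8000), @hdoc INT;
SET @doc = '<Orders><Order Id="1" Customer="Acme"/></Orders>';

EXEC sp_xml_preparedocument @hdoc OUTPUT, @doc;

SELECT *
FROM OPENXML(@hdoc, '/Orders/Order', 1)
     WITH (Id INT '@Id', Customer VARCHAR(50) '@Customer');

EXEC sp_xml_removedocument @hdoc;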
Hi All
I am trying to load DBF files into SQL Server within the CLR (actually, if I just run the select statement outside, say within the SQL query window, I get the same result), but with the following error:
Msg 7314, Level 16, State 1, Line 1
The OLE DB provider "VFPOLEDB" for linked server "MYDBF" does not contain the table "T8866064". The table either does not exist or the current user does not have permissions on that table.
I created the linked server in this way
EXEC sp_addlinkedserver
@server = 'MYDBF',
@provider = 'VFPOLEDB',
@srvproduct = 'My Data',
@datasrc = 'c:data'
I did not create the login, since my SQL instance is running under a super account with all privileges.
What frustrates me is that I can read most of the DBF files, but just a few of them are not readable.
Can anyone give me some hints on it?
By the way, I am using VFP 9.0.
thanks
michael
Need help reading a binary file; see below for details...
I have uploaded a CSV file into a SQL table.
Now I want to extract the data and insert the data from the CSV file into another SQL table.
What commands can I use in SQL to extract/read the data?
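If each CSV line has been loaded as one row in a staging table, a hedged sketch of splitting a two-column line with CHARINDEX/SUBSTRING and inserting into the target table (table and column names are hypothetical):

-- dbo.StagingCsv(line) holds raw lines such as 'Acme,100'
INSERT INTO dbo.Target (Customer, Amount)
SELECT LEFT(line, CHARINDEX(',', line) - 1),
       CAST(SUBSTRING(line, CHARINDEX(',', line) + 1, 8000) AS INT)
FROM dbo.StagingCsv
WHERE CHARINDEX(',', line) > 0;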
Hi, I wrote a report builder that creates a CSV file, but I can't redirect to the file to view it. ERROR: Cannot find the server! This works fine on the dev machine, but deploy it and it does not.
Dim Conn As New SqlConnection(ConfigurationSettings.AppSettings("ConStr"))
Dim Cm As SqlCommand
Dim dr As SqlDataReader
Dim FileName As String = Guid.NewGuid.ToString & ".csv"
Dim FilePath As String = Server.MapPath("") & "\CSVFiles\" & FileName
Dim fs As FileStream = New FileStream(FilePath, FileMode.Create, FileAccess.Write)
Dim sw As StreamWriter = New StreamWriter(fs)
Dim Line As String
Dim lItem As System.Xml.XmlElement
Dim i As Int16
If ValidatePage() Then
Cm = New SqlCommand(BuildQuery, Conn)
Conn.Open()
dr = Cm.ExecuteReader()
Dim lXmlDoc As New System.Xml.XmlDocument()
lXmlDoc.LoadXml(txtHXml.InnerText)
For Each lItem In lXmlDoc.DocumentElement.SelectSingleNode("SELECT").ChildNodes
Select Case lItem.InnerText
Case "chkPRId"
Line &= "PR Id,"
Case "chkDateIssued"
Line &= "Date Issued,"
Case "chkOriginator"
Line &= "Originator,"
Case "chkBuyer"
Line &= "Buyer,"
Case "chkVendor"
Line &= "Vendor,"
Case "chkCostCode"
Line &= "Cost Code,"
Case "chkOracleRef"
Line &= "Oracle Reference,"
Case "chkTotal"
Line &= "Total,"
End Select
Next
Line = Line.Substring(0, Line.Length - 1)
sw.WriteLine(Line)
Line = ""
While dr.Read
For i = 0 To dr.FieldCount - 1
Line = Line & dr(i) & ","
Next
sw.WriteLine(Line)
Line = ""
End While
dr.Close()
Conn.Close()
sw.Close()
fs.Close()
Response.Redirect(FilePath)
End If
ResetPage()
PopulateListBoxFromXML()
Hi,
I received a SQL Server 6.5 database (.dat and .log files). I'm currently running 7.0. Is there any way to read/import this data?
I tried to do this:
I have installed SQL Server 6.5. After the install, I created a database with the same names (.dat and .log) and made it bigger than the original .dat and .log files.
Then I stopped the SQL Server service, deleted the .DAT and .LOG files that I had created, and copied the original .DAT and .LOG into their place. I started the SQL Server service and the database was marked suspect. I am still unable to read the data.
Can you help me?
Hi,
I need to read a binary file with the extension *.dat, like this:
tat.dat, and import it into MS SQL Server 7.0.
Thanks for any help
Dr Bangaly Diané
hi
How can I read a txt file by using T-SQL commands?
thanx
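A hedged sketch using BULK INSERT to load each line of the text file into a one-column staging table (placeholder path; assumes the lines contain no tab characters and that you have bulk-load permissions):

CREATE TABLE dbo.TextLines (line VARCHAR(8000));

BULK INSERT dbo.TextLines
FROM 'C:\temp\input.txt'
WITH (ROWTERMINATOR = '\n');

SELECT line FROM dbo.TextLines;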
How can I read an external file (*.txt, *.csv) through a stored procedure in MS SQL Server 2000?
On the Interbase database it's very simple:
CREATE TABLE table [EXTERNAL [FILE] '<filespec>']
( <col_def> [, <col_def> | <tconstraint> ...]);

EXTERNAL [FILE] '<filespec>' declares that data for the table under creation resides in a table or file outside the database; <filespec> is the complete file specification of the external file or table.
Please advise
Thanks
Patricio
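SQL Server 2000 has no EXTERNAL FILE clause, but a roughly equivalent hedged trick is to query the text file through the Jet text driver with OPENROWSET from inside a stored procedure; the folder and file name below are placeholders, and HDR=Yes treats the first row as column headers:

SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Text;Database=C:\data\;HDR=Yes',
                'SELECT * FROM sales.csv');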
Hi,
I'd like to read data from a (well-formatted) file into an MS-SQL database.
What is the best way to do this?
Jim
I am writing a program to read the SQL Server log file without shutting down SQL Server. However, it says that the file is used by another process, which I believe is SQL Server.
Is there any way to open the log file without shutting down the server? I know that Log Explorer can read the online log file, but I do not know the technology they are using.
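The transaction log file itself is held open exclusively by SQL Server, so it cannot be opened as an ordinary file while the service is running; tools like Log Explorer read it through the server instead. One hedged, undocumented option is the fn_dblog function, which returns active log records for the current database:

-- Undocumented and unsupported; the output format can change between builds
SELECT [Current LSN], Operation, [Transaction ID]
FROM ::fn_dblog(NULL, NULL);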
I need to know how to read an XML file using SQL (in SQL Server).
Then I will manipulate the data and update my tables.
Maybe I can put the XML file in the IIS root.
Then what is required for the rest?
Thanks in advance
Benny
I need to create/read files from T-SQL. Does anyone know how I can do it?
Thanks