How To Manage A Paging Mechanism In Data Collection?
May 6, 2008
Dear all,
I have an application which collects different types of history data from an SQL database. One type of those data is history-log information, where the number of records can be really big.
What I would like to implement is a kind of paging mechanism for collecting those data.
For example, the first call would return the first 100 rows, then the next call would get rows 101 to 200, etc.
How do I perform this behaviour from the SQL side?
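One common way to do this on the SQL side (SQL Server 2005 and later) is to number the rows with ROW_NUMBER() and filter on the page you want. The sketch below assumes a hypothetical dbo.HistoryLog table keyed by LogId; adjust the names to your schema.

DECLARE @PageNumber INT, @PageSize INT;
SET @PageNumber = 1;    -- 1 = rows 1-100, 2 = rows 101-200, ...
SET @PageSize   = 100;

;WITH numbered AS
(
    SELECT ROW_NUMBER() OVER (ORDER BY LogId) AS rn,
           LogId, LogDate, LogMessage
    FROM dbo.HistoryLog
)
SELECT LogId, LogDate, LogMessage
FROM numbered
WHERE rn BETWEEN (@PageNumber - 1) * @PageSize + 1
             AND @PageNumber * @PageSize
ORDER BY rn;

On SQL Server 2012 and later, ORDER BY ... OFFSET ... ROWS FETCH NEXT ... ROWS ONLY does the same thing more directly.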
Regards,
Serge
View 2 Replies
Sep 20, 2005
What is the better table design for a data collection application?
1. Vertical model (pk, attributeName, attributeValue)
2. Custom columns (pk, custom1, custom2, custom3 ... custom50)
Since the data elements collected may change year over year, which model better handles this changing set of columns?
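To make the trade-off concrete, here is a minimal sketch of both options with placeholder names (not a recommendation for a specific schema):

-- Option 1: vertical (EAV) model -- a new data element is just a new row, no schema change.
CREATE TABLE dbo.CollectionEAV
(
    RecordId       INT          NOT NULL,
    AttributeName  VARCHAR(100) NOT NULL,
    AttributeValue VARCHAR(500) NULL,
    CONSTRAINT PK_CollectionEAV PRIMARY KEY (RecordId, AttributeName)
);

-- Option 2: wide table with generic custom columns -- simpler queries and typing,
-- but adding the 51st element (or changing a column's meaning year over year) needs DDL changes.
CREATE TABLE dbo.CollectionWide
(
    RecordId INT NOT NULL PRIMARY KEY,
    Custom1  VARCHAR(500) NULL,
    Custom2  VARCHAR(500) NULL,
    Custom3  VARCHAR(500) NULL
    -- ... Custom50 VARCHAR(500) NULL
);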
View 7 Replies
View Related
Sep 28, 2015
I've been quite excited about SQL Server MDS that should allow non-IT staff to easily maintain data. However maintaining data that have many-to-many relationships seems to be quite a pain. I believe the best way is:
Open up your MDS web interface
Go to Entities > Product (for example)
Add a new member and fill in the details
Click "Product Parts" in the bottom right "Related Entities" part
Add a new member
Try and find the product you just created from the dropdown list
Select the first part and click OK
Again try and find the product you have created from the dropdown list
Select the second part and click OK
Repeat...
Close the tab in your browser and finish your product entity.

How I wish it worked:

Open up your MDS web interface
Go to Entities > Product (for example)
Add a new member and fill in the details
Check a checkbox for each part visible under "Product Parts" in the bottom right "Related Entities" part
Finish your product entity.
View 3 Replies
View Related
Mar 7, 2008
I need to feed a head office SQL Server with the data from regional servers. The servers are spread across all continents.
Data input is done locally on the head office server as well, plus we need to ship data from the other servers.
So to clarify: the head office server is not a standby one, so mirroring is out of the picture, I think.
Initially, I thought of shipping a log every 15 minutes and restoring it on the head office server, but is that going to create an issue for the local data processing?
Any bright ideas welcome!
View 2 Replies
View Related
May 30, 2014
I am trying to configure the Data Collector on my servers. I configured the data collector on server A and set it up on server B, but the "Query Statistics" collection set does not show me any data.
I right-click and select "Collect and upload now" and get a success result, but in the report I can't see any data.
Also, on the log page of the data collection I see many errors with messages like this:
"Failed to create kernel event for collection set: {2DC02BD6-E230-4C05-8516-4E8C0EF21F95}. Inner Error ------------------> Cannot create a file when that file already exists."
I tried some solutions like disabling and enabling again, re-configuring, removing and configuring again, but none of them works.
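When a collection set reports success but the report stays empty, it can help to look at what the collector logged in msdb. A minimal diagnostic sketch, assuming the view and column names of the SQL Server 2008-era data collector (verify against your build):

USE msdb;

-- Is the collection set actually running, and in which mode?
SELECT name, is_running, collection_mode, days_until_expiration
FROM dbo.syscollector_collection_sets
WHERE name = N'Query Statistics';

-- Recent collect/upload runs and any failure text.
SELECT TOP (20) log_id, start_time, finish_time, status, failure_message
FROM dbo.syscollector_execution_log
ORDER BY start_time DESC;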
View 2 Replies
View Related
Apr 14, 2008
Hello,
I am working on an ASP.NET web site using an SQL database.
Is there a way to Insert/Delete and Update data in the database without creating all the forms in ASP.NET?
Since I am the only person to work with the database it would be easier for me than creating all ASP.NET forms.
Thanks,
Miguel
View 2 Replies
View Related
Mar 30, 2014
Is it possible that Data Collection can cause a massive increase in MB/sec to tempdb? I cannot find the connection with tempdb. I did set up a cache file, but it is on the same disk.
Or could it be something different? For the last two weeks, the read/write MB/s to tempdb has been increasing progressively.
At one point it was about 20 MB/sec.
After that it reset and went back to 1 MB/sec.
What I checked: the external company which installed SQL Server created only one file for tempdb. Next weekend, or during a break if possible, I would like to split it into 8 files.
I also saw that the tempdb MDF file kept growing, but its space usage was only 8-10%.
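To see what kind of objects are actually consuming tempdb while the I/O is high, a minimal sketch using sys.dm_db_file_space_usage (available from SQL Server 2005 onwards; pages are 8 KB):

USE tempdb;

SELECT
    SUM(user_object_reserved_page_count)     * 8 / 1024 AS user_objects_mb,
    SUM(internal_object_reserved_page_count) * 8 / 1024 AS internal_objects_mb,
    SUM(version_store_reserved_page_count)   * 8 / 1024 AS version_store_mb,
    SUM(unallocated_extent_page_count)       * 8 / 1024 AS free_space_mb
FROM sys.dm_db_file_space_usage;

Large internal_objects_mb values typically point to sorts, hashes or spools spilling to tempdb.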
View 2 Replies
View Related
May 19, 2008
Hi, this is Sanjeev.
I have an SSIS package. Using my C# program I want to add an Execute Package Task to this package's Sequence Container.
It creates the new package without any problem, but when I open the package and try to move the newly created Execute Package Task it gives the following error:
"The element cannot be found in a collection. This error happens when you try to retrieve an element from a collection on a container during the execution of the package."
This is my code:
{
    Package pkg = new Package();   // NOTE: pkg is never used below; the rest of the code works with p.
    string str = (string)entry.Key;
    pkg.Name = str;
    alEntity = (ArrayList)entry.Value;

    ConnectionManager conMgr;
    Executable chPackage;
    TaskHost executePackageTask;
    Microsoft.SqlServer.Dts.Runtime.Application app = new Microsoft.SqlServer.Dts.Runtime.Application();

    //string PackagePath = @"c:\Genesis.dtsx";
    //p = app.LoadPackage(PackagePath, null);

    // Load the parent package from its XML definition and rename it.
    p = new Package();
    p.LoadFromXML(parentPackageBody, null);
    p.Name = str;

    // Grab the sequence container that will host the Execute Package Tasks.
    //Sequence seqContainer;
    IDTSSequence seqContainer;
    //seqContainer = (Sequence)p.Executables["Extract Genesis Data"];
    seqContainer = (Sequence)p.Executables[0];

    string packageLocation = @"\Geneva Packages\";
    conMgr = p.Connections["SQLChildPackagesConnectionString"];

    foreach (string val in alEntity)
    {
        // Only add an Execute Package Task if one with this name is not already in the container.
        if (seqContainer.Executables.Contains("Load_" + val) == false)
        {
            chPackage = seqContainer.Executables.Add("STOCK:ExecutePackageTask");
            executePackageTask = (TaskHost)chPackage;
            executePackageTask.Name = "Load_" + val;
            executePackageTask.Description = "Execute Package Task";
            executePackageTask.Properties["Connection"].SetValue(executePackageTask, conMgr.Name);
            executePackageTask.Properties["PackageName"].SetValue(executePackageTask,
                packageLocation + ddlApplication.SelectedItem.Text + @"\" + executePackageTask.Name);
        }
    }

    app.SaveToXml(Server.MapPath("../SynchronizeScript/Packages/" + ddlApplication.SelectedItem.Text + @"\") + str + ".dtsx", p, null);
}
Please let me know what is wrong in my code.
Thanks in advance.
Regards,
Sanjeev Bollina
sanjay.bollina@gmail.com
View 14 Replies
View Related
Apr 12, 2008
Hi Fellas,
I have a problem and it is described as follows:
1. The user may browse any website on the internet that may be in any language and enter the data into my application.
2. The data entered can be in English or any other language.
3. That's where the problem arises: data entered in a language other than English is displayed in the wrong format, as small boxes.
4. The user who previously entered the data into the database can also alter that data. So when I display the data (other than English) back to him, it is not in the format in which the user entered it.
So can anybody help me out with how to manage multilingual data?
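The usual cause of the "small boxes" is that the text is stored in non-Unicode (varchar) columns or sent without Unicode literals/parameters, so it gets converted to the database's code page on the way in. A minimal sketch with placeholder names: store the text in NVARCHAR and keep it Unicode end to end (N'...' literals in ad-hoc SQL, SqlDbType.NVarChar parameters from the application).

CREATE TABLE dbo.UserNotes
(
    NoteId   INT IDENTITY(1,1) PRIMARY KEY,
    NoteText NVARCHAR(2000) NOT NULL
);

-- The N prefix keeps the literal in Unicode instead of the database code page.
INSERT INTO dbo.UserNotes (NoteText) VALUES (N'こんにちは, привет, مرحبا');

SELECT NoteId, NoteText FROM dbo.UserNotes;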
View 2 Replies
View Related
Feb 6, 2000
I have a product where we feed an SQL 7 DB data collected from manufacturing. Presently, the data transport program is in charge of getting prepared data from machines and inserting it into the DB. This design assumes SQL 7 is always ready and able, which is not true due to customer queries, backups, etc. consuming resources.

There is a low-level buffer in the system at the manufacturing level if transport dies, but transport is ignorant of SQL distress, so it keeps hammering the DB's front door. I'm looking for help in putting SQL Server in charge of allowing data in, when resources are adequate. It seems I need a function that can determine server stress QUICKLY to forestall the transport program, and a buffer for records at the transport layer.

Has anyone known / built a system where SQL Server CHECKS for waiting records, or OKs an external program to send until told to stop? What indicates (reliably) low server resources? Anyone ever used MSMQ?
"Black Holes are proof SOMEBODY, SOMEWHERE really did have a particularly bad Y2K problem!"
View 7 Replies
View Related
Apr 17, 2014
I enabled Data Collection on one of the servers and planned to make it the centralised management data warehouse. I configured data collection on it and can view reports. Next, I went to the other server and configured "Set up data collection" to use my first instance as the centralised database. But the issue is that I can only see reports for the first server. Am I missing something here?
I did exactly as explained in this video [URL] .....
View 9 Replies
View Related
Jun 19, 2015
I created the Data Collection in the wrong DB. How can I change the DB, or return to the default (as it came with a clean installation of SQL Server)?
View 3 Replies
View Related
Jun 5, 2014
I have a two-node SQL 2012 AlwaysOn HADR cluster (v11.0.3412) with 4 availability groups configured. The AG groups are set to synchronous mode and the secondary is not readable (we do not want the synchronous replica readable, so we do not risk any reads causing contention and we maintain fast performance).
On the secondary we are getting a persistent failure with the Data Collector job called Collection_Set_3_Upload. The failure occurs within the second job step. That job step is executing the following command:
dcexec -u -s 3 -i "$(ESCAPE_DQUOTE(MACH))\$(ESCAPE_DQUOTE(INST))"
The error message is as follows:
Log Job History (collection_set_3_upload) Step ID 2 Server CLUSTERNODE2
Job Name collection_set_3_upload
Step Name collection_set_3_upload_upload
Duration 00:00:07
[Code] ....
I know I can prevent this error message by enabling readable secondaries, but we do not want this.
I have tried stopping the data collection jobs and purging the cache directory but to no avail. It will succeed the first time then persistently fail again with the same message every time after that.
In addition, if I set the one failing AG group to readable secondary the job succeeds. So that means that 3/4 work fine, only this one is having an issue.
View 0 Replies
View Related
Oct 26, 2015
Last weekend many of our servers had a failed job "collection_set_3_upload". The error that occurred is: "Violation of PRIMARY KEY constraint 'PK_active_sessions_and_requests'. Cannot insert duplicate key in object 'snapshots.active_sessions_and_requests'. The duplicate key value is (2824333, 2015-10-25 02:54:49.7630000 +02:00, 1)." Last weekend we happened to go from summer time to winter time, i.e. the clock passed 02:00 - 03:00 twice during that night.
In other words, there is a bug in the Data Collector component that collects data for the Management Data Warehouse: it uses local time instead of UTC. I've created a Connect item to report it to Microsoft. [URL] ... But how do you get your process running again? The job will no longer succeed because every 5 minutes it keeps trying to upload the conflicting data for the second 02:00 - 03:00 period. I've only found one solution: get rid of all data collected but not yet uploaded.
You do this by stopping the collection set (in SSMS go to Object Explorer -> <the server you want to fix> -> Management -> Data Collection -> System Data Collection Sets; right-click "Query Statistics" and select "Stop Data Collection Set"). Then you delete the cached results from the SQL Server machine's hard disk. These cached results are in files located in a Temp folder on the SQL machine itself, inside the AppData folder for the service account SQL Server Agent is running under. Usually it will be something like: C:\Users\<sql agent service account>\AppData\Local\Temp.
Inside this folder delete all files that have 'QueryActivity' in their name. You'll lose all data collected since the start of winter time, but at least your data collection process will work again. After this you can start the collection set again by right-clicking it and selecting "Start Data Collection Set". Every 5 minutes the data will be summarised and uploaded into your management data warehouse.
View 0 Replies
View Related
Apr 10, 2001
Hi All,
How do I input a large text page (from Notepad) into a SQL column, or assign a pointer to the data? I've tried to use BOL (WRITETEXT) to no avail; I guess I'm missing something. I'm just using EM and Query Analyzer. I thought this would be easy. Image data should work the same way.
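For reference, a minimal sketch of the classic text-pointer pattern on a TEXT column, using a hypothetical dbo.Docs table (the row must exist and the column must be non-NULL before TEXTPTR returns a usable pointer):

CREATE TABLE dbo.Docs (id INT PRIMARY KEY, body TEXT);

INSERT INTO dbo.Docs (id, body) VALUES (1, '');   -- initialise so the text pointer is valid

DECLARE @ptr varbinary(16);
SELECT @ptr = TEXTPTR(body) FROM dbo.Docs WHERE id = 1;

-- WRITETEXT replaces the whole value; UPDATETEXT can append it in chunks.
WRITETEXT dbo.Docs.body @ptr 'paste or parameterise the large text here';

On SQL Server 2005 and later, a VARCHAR(MAX) column plus OPENROWSET(BULK 'C:\file.txt', SINGLE_CLOB) is a much simpler way to load a whole file.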
TIA,
Dave
View 1 Replies
View Related
Jul 22, 2007
Hi Guys,
I'm using Visual Basic 2005 Express and SQL Server 2005 Express. I have textboxes on a VB form linked to 2 database tables.
I am wondering if it is possible to use just ONE BindingNavigator to manage data entry and updates to both database tables. I initially thought I could manage the tables, but I have encountered some problems:
i) When I enter a new record and click the Save button, the textboxes for the 1st table save the record to the database, but the textboxes for the 2nd table still retain their data and do not save it to the database.
ii) The same textboxes for the 2nd table are NOT allowing updates either! Or could it simply be that it is not possible to use this method for data entry and updates?
Thank you.
View 1 Replies
View Related
Mar 18, 2015
I setup data collection on a production server to capture growth rates.
When I run the disk usage report, it shows a daily growth rate of over 500 MB. This seems excessive to me.
As a troubleshooting step I then ran sp_spaceused and got these results:
database_name    database_size    unallocated space
rgc_prod         273442.63 MB     3648.48 MB

reserved         data             index_size       unused
265345488 KB     164385384 KB     99826072 KB      1134032 KB
What should my next steps be to try and determine why there is so much growth? And isn't the index size rather large?
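To see which tables and indexes account for the growth, a minimal sketch using sys.dm_db_partition_stats (pages are 8 KB); run it in the database in question:

SELECT TOP (20)
    OBJECT_SCHEMA_NAME(ps.object_id)        AS schema_name,
    OBJECT_NAME(ps.object_id)               AS table_name,
    ps.index_id,
    SUM(ps.reserved_page_count) * 8 / 1024  AS reserved_mb,
    SUM(ps.row_count)                       AS total_rows
FROM sys.dm_db_partition_stats AS ps
WHERE OBJECTPROPERTY(ps.object_id, 'IsMsShipped') = 0
GROUP BY ps.object_id, ps.index_id
ORDER BY reserved_mb DESC;

Comparing this output over a few days shows whether the growth is table data, a particular index, or reserved-but-unused space.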
View 1 Replies
View Related
Apr 28, 2008
I've got anywhere from a couple hundred to a hundred thousand records that need to be updated or inserted into their SQL Server 2005 destination. What are some of the best ways to accomplish this? Right now we are doing it manually through line-by-line updates and inserts. Would I use BCP or some other bulk import tool?
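A common pattern (a sketch with placeholder table, file and column names, not a drop-in script) is to bulk-load the incoming rows into a staging table and then apply set-based statements, instead of row-by-row updates and inserts. On SQL Server 2005 the two-statement upsert below is the usual approach (MERGE only arrives in 2008).

CREATE TABLE dbo.Customer_Staging
(
    CustomerId INT           NOT NULL PRIMARY KEY,
    Name       VARCHAR(100)  NOT NULL,
    Balance    DECIMAL(12,2) NOT NULL
);

BULK INSERT dbo.Customer_Staging
FROM 'C:\feeds\customers.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

BEGIN TRANSACTION;

-- Update the rows that already exist...
UPDATE c
SET    c.Name = s.Name, c.Balance = s.Balance
FROM   dbo.Customer AS c
JOIN   dbo.Customer_Staging AS s ON s.CustomerId = c.CustomerId;

-- ...then insert the ones that don't.
INSERT INTO dbo.Customer (CustomerId, Name, Balance)
SELECT s.CustomerId, s.Name, s.Balance
FROM   dbo.Customer_Staging AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Customer AS c WHERE c.CustomerId = s.CustomerId);

COMMIT TRANSACTION;

bcp.exe or an SSIS data flow can replace the BULK INSERT step if the file lives somewhere the server cannot read directly.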
View 5 Replies
View Related
Apr 17, 2007
Hello,
I have a SSIS package with a Data Flow task. This task transfers the data from SQL Server 2000 to a table in SQL Server 2005.
I deployed and tested this package on the Test Server. Then put this package in a job and executed it - Works fine.
On the production server, if I execute the package through DTEXECUI it works fine. But when I try executing it through a job, the job fails and gives me the following error:
Description: The external metadata column collection is out of synchronization with the data source columns. The "external metadata column "T_FieldName" (82)" needs to be removed from the external metadata column collection....
What I don't understand is why no errors are displayed when I execute the package through DTEXECUI.
Can anyone help me resolve this issue?
Thanks.
View 3 Replies
View Related
Aug 27, 2015
We have several read-only nodes in our AlwaysOn cluster, which are set to use Synchronous-commit mode, which ensures that the logs are updated on the read-only nodes before any update statements complete. Even with this option, if we query a read-only node before the logs have been processed, we can read old data. I would like to know a strategy to ensure that a read-only query will definitely return up to date information. I had an idea that if I just used a different transaction type, like Serializable, that it might block the read-only query from actually getting the data until after the log file was processed, but I have not tried it, yet.
I would like to move more queries to the read-only nodes, in an effort to offload CPU utilization from the primary node.
View 2 Replies
View Related
Jun 12, 2014
This is my stored procedure:
ALTER PROCEDURE [dbo].[PageWise]
@PageIndex INT = 1
,@PageSize INT = 10
,@PageCount INT OUTPUT,
@CatId int
[code]....
I'm trying to page data, but the page size doesn't work and it returns all the data.
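Since the procedure body was truncated above, here is a minimal sketch of a pattern that does page correctly (table and column names are placeholders, not the poster's schema). The usual culprit when every row comes back is that the ROW_NUMBER() value is never filtered in an outer query or CTE:

ALTER PROCEDURE [dbo].[PageWise]
      @PageIndex INT = 1
    , @PageSize  INT = 10
    , @PageCount INT OUTPUT
    , @CatId     INT
AS
BEGIN
    SET NOCOUNT ON;

    -- Total number of pages for the caller's pager control.
    SELECT @PageCount = CEILING(COUNT(*) * 1.0 / @PageSize)
    FROM dbo.Products
    WHERE CatId = @CatId;

    ;WITH numbered AS
    (
        SELECT ROW_NUMBER() OVER (ORDER BY ProductId) AS rn,
               ProductId, ProductName, Price
        FROM dbo.Products
        WHERE CatId = @CatId
    )
    SELECT ProductId, ProductName, Price
    FROM numbered
    WHERE rn BETWEEN (@PageIndex - 1) * @PageSize + 1
                 AND @PageIndex * @PageSize
    ORDER BY rn;
END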
View 8 Replies
View Related
May 16, 2008
Hi,
How do I retrieve the connections (connection managers) collection from a custom Data Flow destination? ComponentMetaData.RuntimeConnectionCollection is empty. I would like to be able to access all the connections defined in the package from the custom data flow task.
I came across code in which it was possible to access the Connections collection using IDtsConnectionService for a custom task (destination). The custom task has access to the serviceProvider, which can be used to get access to the IDtsConnectionService interface, but the custom data flow task does not.
Any help appreciated.
Thanks
Naveen
View 5 Replies
View Related
Jan 18, 2007
I have been working for 2 days on integrating Reporting Services with MOSS 2007.
The problem I face is creating a data source for the uploaded report (not created using a wizard) that will connect me to the SQL Server.
When I click the arrow near the uploaded report to manage the data source, I find no data source for this item.
I would be pleased to know how to create this data source, and where.
Best Regards,
Lana
View 4 Replies
View Related
May 14, 2015
I'm trying to edit the Expressions of a Data Flow task. This seems to happen when I rename some of the Data Flow components but not always. The error I get is:
Element "[ADO Net Source].[SqlCommand]" does not exist in the collection "Properties"
However, if you look at the XML, this property does exist. So I'm not sure why this should occur.
I'm using SSIS 2008 R2 with Visual Studio 2008 V 9.0.30729.4462 QFE.
<component id="1" name="ADO Net Source" componentClassID="{2E42D45B-F83C-400F-8D77-61DDE6A7DF29}" description="Extracts data from a relational database by using a .NET provider." localeId="-1" usesDispositions="true" validateExternalMetadata="True" version="4" pipelineVersion="0" contactInfo="Extracts data from a relational database by using a .NET provider.;
[Code] ....
View 3 Replies
View Related
Apr 23, 2015
I am using SQL Server 2008 R2. I have created a database named testDB. It has a lot of tables, including some log tables, and some of those log tables contain a very large number of records.
My goal is to cap the size of those log tables and move their records to a table in another database stored in a different location, so that my main database has no problem.
Is there any way to perform the steps I described for my database? Is there any such functionality already built into SQL Server?
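There is no built-in "maximum table size" setting, so the usual approach is to archive old rows into the other database in batches and delete them from the source. A minimal sketch with placeholder database, table and column names:

DECLARE @CutOff DATETIME, @Rows INT;
SET @CutOff = DATEADD(MONTH, -3, GETDATE());   -- keep the last 3 months in testDB
SET @Rows = 1;

DECLARE @Batch TABLE (LogId INT PRIMARY KEY, LogDate DATETIME, Message NVARCHAR(MAX));

WHILE @Rows > 0
BEGIN
    DELETE FROM @Batch;

    BEGIN TRANSACTION;

    -- Remove a batch of old rows from the source and capture what was deleted.
    DELETE TOP (5000) FROM testDB.dbo.AppLog
    OUTPUT deleted.LogId, deleted.LogDate, deleted.Message INTO @Batch
    WHERE LogDate < @CutOff;

    SET @Rows = @@ROWCOUNT;

    -- Copy the captured rows into the archive database.
    INSERT INTO ArchiveDB.dbo.AppLog_Archive (LogId, LogDate, Message)
    SELECT LogId, LogDate, Message FROM @Batch;

    COMMIT TRANSACTION;
END

A SQL Server Agent job can run this on a schedule so the log tables stay at a predictable size.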
View 2 Replies
View Related
Mar 12, 2008
Hi all,
From the "How to Call a Parameterized Stored Procedure by Using ADO.NET and Visual Basic.NET" in http://support.microsft.com/kb/308049, I copied the following code to a project "pubsTestProc1.vb" of my VB 2005 Express Windows Application:
Imports System.Data
Imports System.Data.SqlClient
Imports System.Data.SqlDbType
Public Class Form1
Private Sub Form1_Load(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles MyBase.Load
Dim PubsConn As SqlConnection = New SqlConnection("Data Source=.\SQLEXPRESS;integrated security=sspi;" & "initial Catalog=pubs;")
Dim testCMD As SqlCommand = New SqlCommand("TestProcedure", PubsConn)
testCMD.CommandType = CommandType.StoredProcedure
Dim RetValue As SqlParameter = testCMD.Parameters.Add("RetValue", SqlDbType.Int)
RetValue.Direction = ParameterDirection.ReturnValue
Dim auIDIN As SqlParameter = testCMD.Parameters.Add("@au_idIN", SqlDbType.VarChar, 11)
auIDIN.Direction = ParameterDirection.Input
Dim NumTitles As SqlParameter = testCMD.Parameters.Add("@numtitlesout", SqlDbType.Int)
NumTitles.Direction = ParameterDirection.Output
auIDIN.Value = "213-46-8915"
PubsConn.Open()
Dim myReader As SqlDataReader = testCMD.ExecuteReader()
Console.WriteLine("Book Titles for this Author:")
Do While myReader.Read
Console.WriteLine("{0}", myReader.GetString(2))
Loop
myReader.Close()
Console.WriteLine("Return Value: " & (RetValue.Value))
Console.WriteLine("Number of Records: " & (NumTitles.Value))
End Sub
End Class
//////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
The original article uses the code statements shown in pink for a VB.NET console application. I do not know how to print out the output of ("Book Titles for this Author:"), ("{0}", myReader.GetString(2)), ("Return Value: " & (RetValue.Value)) and ("Number of Records: " & (NumTitles.Value)) in the Windows Application Form1 of my VB 2005 Express. Please help and advise.
Thanks in advance,
Scott Chang
View 29 Replies
View Related
Jan 28, 2008
I want to use a database which supports all Latin languages, but I can't find the right collation, especially for English, German, French and Turkish.
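If the data has to cover English, German, French and Turkish in the same database, the simplest option is usually to store the text as NVARCHAR (Unicode) and let the collation control only sorting/comparison rules; Turkish has its own dotted/dotless I casing rules, so Turkish columns may warrant their own collation. A minimal sketch with placeholder names:

CREATE TABLE dbo.Labels
(
    LabelId          INT IDENTITY(1,1) PRIMARY KEY,
    LabelText        NVARCHAR(200) COLLATE Latin1_General_CI_AS NOT NULL,
    LabelTextTurkish NVARCHAR(200) COLLATE Turkish_CI_AS NULL
);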
View 4 Replies
View Related
Feb 27, 2004
I recently migrated a database from server A to server B. The backup jobs I am trying to build on server B are failing because of the following error:
--------------
[Microsoft SQL-DMO] Error 21776: [SQL-DMO]The name 'WinDat' was not found in the Databases collection. If the name is a qualified name, use [] to separate various parts of the name, and try again.
-------------
How do I add this database to the databases collection so it will be recognized?
Thanks in advance for your time and help.
View 9 Replies
View Related
Aug 16, 2007
I can add two reportitem controls, ie reportitems!begbal.value + reportitems!deposits.value, without a problem. However, when I add the 3rd reportitem control to the expression, ie + reportitems!withdrawals.value, some really funky arithmetic occurs. All of these controls I am referring to are in the same group footer.
Any help out there would be greatly appreciated.
View 1 Replies
View Related
Feb 9, 2007
Actually, the earlier version obviously worked, but now I am fetching values from 2 columns, and into the collection we can pass 2 strings, which I did. But then how do I increment the collection c? I will just post the code.
The query fetches 2 records, but I can add only 1.
NameValueCollection c = new NameValueCollection();

// NOTE: concatenating Ridtxt.Text directly is open to SQL injection; a parameter would be safer.
cmd2.CommandText = "select Pname, HoursWorked from TimeSheet1 ts, ProjResource pr, ProjectDetails pd " +
                   "where ts.rid = pr.rid and ts.pcode = pr.pcode and pd.Pcode = ts.Pcode and pr.Rid = '" + Ridtxt.Text + "'";
cmd2.Connection = con1;
con1.Open();

SqlDataReader dr = cmd2.ExecuteReader();
while (dr.Read())
{
    // Add(key, value): duplicate keys are merged into one entry holding multiple values.
    c.Add(dr["pname"].ToString(), dr["hoursworked"].ToString());
}
As before, suppose the result is:
c#   12
asp  11
Only the "c#" row gets added, even though the loop is while (dr.Read()) { ... }.
So how do I increment the key, since in c both strings (the key and the value) are already occupied?
View 3 Replies
View Related
Jul 3, 2004
Hi everyone,
I have a question!
In my ASP.NET page I use a SqlDataAdapter object for working with my database.
First I declare it as a global variable (SqlDataAdapter objDataAdapter;), then I construct it in my Page_Load procedure like this:
objDataAdapter = new SqlDataAdapter(strSql, objConnection);
Then, when I want to use it in my other procedures, an error tells me that the object does not exist.
To solve this error I have to construct my objDataAdapter in every procedure that uses it. But I think such variables should not be erased from the heap automatically (by garbage collection), because there is a reference to them from the stack, I mean the global variable...
Any opinion?
Thanks
View 3 Replies
View Related
Feb 12, 2005
I am collecting companyIDs from a data grid, and I want to pass the selected values to a sproc via a variable. Any idea on the syntax?
this works using a query string within my code
WHERE (dbo.Promotions.ExpirationDate > GETDATE()) AND (dbo.Promotions.CompanyID IN (" + selectedCompanies + "))
this doesn't work within my sproc
WHERE (dbo.Promotions_ByLink.ExpirationDate > GETDATE()) AND (dbo.Promotions_ByLink.CompanyID IN (@SelectedCompanies))
I also tried
WHERE (dbo.Promotions_ByLink.ExpirationDate > GETDATE()) AND (dbo.Promotions_ByLink.CompanyID IN (SELECT @SelectedCompanies))
and
WHERE (dbo.Promotions_ByLink.ExpirationDate > GETDATE()) AND (dbo.Promotions_ByLink.CompanyID IN (' + @SelectedCompanies + '))
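A single @SelectedCompanies variable is treated as one string by the IN clause, which is why it doesn't behave like the concatenated query string. One hedged option (a sketch; the function name and the assumption that the list is comma-separated integer IDs are mine) is to split the string into a table and use IN against that:

CREATE FUNCTION dbo.SplitIntList (@List VARCHAR(MAX))
RETURNS TABLE
AS
RETURN
(
    -- Turn 'a,b,c' into XML nodes and read each one back as an INT.
    SELECT x.i.value('.', 'INT') AS CompanyID
    FROM (SELECT CAST('<i>' + REPLACE(@List, ',', '</i><i>') + '</i>' AS XML) AS doc) AS d
    CROSS APPLY d.doc.nodes('/i') AS x(i)
);

-- Inside the sproc:
-- WHERE dbo.Promotions_ByLink.ExpirationDate > GETDATE()
--   AND dbo.Promotions_ByLink.CompanyID IN (SELECT CompanyID FROM dbo.SplitIntList(@SelectedCompanies))

The alternative is dynamic SQL with sp_executesql, which works like the query-string version but re-opens the door to SQL injection unless the IDs are validated first.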
Thanks
View 4 Replies
View Related