I created a website in Visual Studio, and the data will be stored in SQL Server 2005. I published the website to a local folder on my machine (because I don't have an internet connection on my machine) and want to upload those files from an internet cafe.
Let me explain my problem with the "data source". The following statement is extracted from the web.config file, and I want to know what the data source should be when I upload my website. Do I need to give the IP address, the name of the server, or something else? Could you give me an example?
Moreover, there are two ways I connect to the database: in the first, I created the database directly in SQL Server, and in the second the database file is attached in my website directory. Please tell me in detail what the data source would be in both circumstances.

<connectionStrings>
On my machine I used SQL Server Express; if the web server contains SQL Server Enterprise Edition, then what happens in that scenario?
This is where I am very confused: if I want, I can create the database on my own machine, but how do I create it on their server?
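For illustration, here is a hedged sketch of what the data source looks like in each of the two scenarios asked about. The server name, database name, and credentials below are placeholders (not values from the post); you would replace them with whatever your hosting provider gives you.

using System;
using System.Data.SqlClient;

class ConnectionStringExamples
{
    static void Main()
    {
        // Scenario 1: database created directly on a SQL Server instance.
        // Data Source is the server name or IP address supplied by the host,
        // e.g. "sql01.myhost.com", "192.168.1.10", or "192.168.1.10\SQLEXPRESS".
        string serverDatabase =
            "Data Source=sql01.myhost.com;Initial Catalog=MyDatabase;" +
            "User ID=myUser;Password=myPassword;";

        // Scenario 2: an .mdf file attached from the site's App_Data folder
        // (SQL Server Express with a user instance).
        string attachedDatabase =
            @"Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\MyDatabase.mdf;" +
            "Integrated Security=True;User Instance=True;";

        // Opening the connection fails if the Data Source value is wrong or unreachable.
        using (SqlConnection conn = new SqlConnection(serverDatabase))
        {
            conn.Open();
            Console.WriteLine("Connected to " + conn.DataSource);
        }
    }
}

Note that user instances are a SQL Server Express feature, so when the web server runs a full edition such as Enterprise, the AttachDbFilename/User Instance form generally does not apply; the host normally creates or restores the database on its own instance and you use a server-style connection string like the first one.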
I am getting this error:

ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified

Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.

Exception Details: System.Data.Odbc.OdbcException: ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified

Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.

Stack Trace:
[OdbcException (0x80131937): ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified]
  System.Data.Odbc.OdbcConnection.HandleError(OdbcHandle hrHandle, RetCode retcode) +35
  System.Data.Odbc.OdbcConnectionHandle..ctor(OdbcConnection connection, OdbcConnectionString constr, OdbcEnvironmentHandle environmentHandle) +131
  System.Data.Odbc.OdbcConnectionFactory.CreateConnection(DbConnectionOptions options, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningObject) +98
  System.Data.ProviderBase.DbConnectionFactory.CreateNonPooledConnection(DbConnection owningConnection, DbConnectionPoolGroup poolGroup) +27
  System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection) +47
  System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory) +105
  System.Data.Odbc.OdbcConnection.Open() +37
  DBConnection.open() +12
  ASP.global_asax.Session_Start(Object sender, EventArgs e) +35
  System.Web.SessionState.SessionStateModule.RaiseOnStart(EventArgs e) +2163182
  System.Web.SessionState.SessionStateModule.CompleteAcquireState() +154
  System.Web.SessionState.SessionStateModule.BeginAcquireState(Object source, EventArgs e, AsyncCallback cb, Object extraData) +542
  System.Web.AsyncEventExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() +90
  System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously) +155

Version Information: Microsoft .NET Framework Version: 2.0.50727.42; ASP.NET Version: 2.0.50727.42; ODBC version: 3.526.1830.0

The connection string is:

strconn = "DSN=testserver;uid=tester;pwd=tester;DATABASE=Test_Database"

I have Test_Database on two machines (with the same data on both): one on a SQL Server 2005 server and one on a SQL Server 2000 server. When the website is published, it does not work with either database, but it works in debug mode in Visual Studio 2005. The website fails when it uses the ODBC connection but works when it uses SqlClient. Please help me solve this problem. Thanks in advance.
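For reference, here are the three connection styles this post touches on, written as C# string literals. The DSN-less and SqlClient forms are illustrative possibilities rather than a confirmed fix; the IM002 error typically means the DSN named in the first form is not defined on the machine actually serving the site.

class OdbcConnectionStringExamples
{
    // DSN-based ODBC: requires a System DSN called "testserver" to be defined
    // on the web server itself (not just on the development machine).
    const string OdbcWithDsn =
        "DSN=testserver;uid=tester;pwd=tester;DATABASE=Test_Database";

    // DSN-less ODBC: no DSN has to exist anywhere; the driver is named explicitly.
    const string OdbcDsnless =
        "Driver={SQL Server};Server=testserver;Database=Test_Database;" +
        "Uid=tester;Pwd=tester;";

    // SqlClient: the provider the poster says already works when published.
    const string SqlClient =
        "Data Source=testserver;Initial Catalog=Test_Database;" +
        "User ID=tester;Password=tester;";
}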
Hi. In this code, how can I create a new data source, data source view, model, and structure so that it runs dynamically? In this code I get a lot of errors, which are about the server and database not existing in the current code. In this code, should I define the server first or not?
How can I create the data source, data source view, model, and structure? Please show the code for that and guide me. databasename and srv are unknown. Do I need to add another reference for Analysis Services? Please explain this code:

1) RelationalDataSource dsNew = new RelationalDataSource(
       datasourceName,
       Utils.GetSyntacticallyValidID(datasourceName, typeof(RelationalDataSource)));
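A minimal AMO sketch of what the posted line is part of, assuming a reference to Microsoft.AnalysisServices.dll and made-up server, database, and table names. The server connection does have to be opened first, and a data source view additionally needs a DataSet describing the relational schema.

using System.Data;
using System.Data.SqlClient;
using Microsoft.AnalysisServices;   // AMO: add a reference to Microsoft.AnalysisServices.dll

class CreateDataSourceSketch
{
    static void Main()
    {
        string datasourceName = "MyRelationalSource";            // placeholder name

        Server srv = new Server();
        srv.Connect("Data Source=localhost");                    // the SSAS instance to use

        Database db = srv.Databases.FindByName("MyMiningDb")     // placeholder SSAS database
                      ?? srv.Databases.Add("MyMiningDb");

        // The lines from the post, plus a connection string, attached to the database.
        RelationalDataSource dsNew = new RelationalDataSource(
            datasourceName,
            Utils.GetSyntacticallyValidID(datasourceName, typeof(RelationalDataSource)));
        dsNew.ConnectionString =
            "Provider=SQLNCLI;Data Source=localhost;Initial Catalog=SourceDb;Integrated Security=SSPI;";
        db.DataSources.Add(dsNew);

        // A data source view wraps a DataSet that holds the schema of the source tables.
        DataSet schema = new DataSet("Schema");
        using (SqlConnection conn = new SqlConnection(
                   "Data Source=localhost;Initial Catalog=SourceDb;Integrated Security=SSPI;"))
        using (SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM dbo.MyTable", conn))
        {
            adapter.FillSchema(schema, SchemaType.Mapped, "MyTable");
        }

        DataSourceView dsv = db.DataSourceViews.Add("MyDsv");
        dsv.DataSourceID = dsNew.ID;
        dsv.Schema = schema;

        db.Update(UpdateOptions.ExpandFull);   // push the new objects to the server
        srv.Disconnect();
    }
}

The mining structure and model would then be added to the same database (via db.MiningStructures) against this data source view before calling Update.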
I have set up a new connection as a connection from a data source, but I cannot see how to use this connection to create my Data Flow source. I have tried using an OLE DB connection, but this is painfully slow! The process of loading 10,000 rows takes 14-15 minutes. The same process in Access, using SQL on a linked table via a DSN, takes 45 seconds.
Have I missed something in my setup of the OLE DB source/connection? Would a DSN source be faster?
I am looking to host a website containing, basically, a table that is linked to a MS SQL Server 2005 database, which I want to update on a pretty much continual basis from my own server PC, where I am running a data mining tool that updates that MS SQL Server 2005 database.
Any idea on how I would achieve this, or any pointers would be much appreciated.
Can someone point me in the right direction? I would like to grab some data from an email (not as an attachment) and load it into a table, but I have no idea where to begin. Also, can this data be retrieved from a website?
Hi guys, I want to use an SSIS package to download files from an HTTP website. I tried using the HTTP connection manager and the Web Service task, but it asks me for a WSDL file and I don't know where that file is. Can anyone guide me, or suggest another way to do this?
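If the files are plain downloads rather than a web service (hence no WSDL), one commonly used alternative is a Script Task that uses the HTTP connection manager directly. This is only a sketch: the connection manager name and local path are placeholders, and in an SSIS 2005 Script Task the same calls would be written in VB.NET rather than C#.

// Body of an SSIS Script Task's Main() method.
using Microsoft.SqlServer.Dts.Runtime;

public void Main()
{
    // "HTTP Connection Manager" is whatever the HTTP connection is named in the package;
    // its ServerURL points at the file to download.
    object nativeObject = Dts.Connections["HTTP Connection Manager"].AcquireConnection(null);
    HttpClientConnection connection = new HttpClientConnection(nativeObject);

    // Save the file locally, overwriting any existing copy.
    connection.DownloadFile(@"C:\Temp\downloaded_file.csv", true);

    Dts.TaskResult = (int)ScriptResults.Success;
}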
Hi everybody, I want to create a data source and data source view for data mining using C#. I have created the data source and data source view and exported them to an XML file, but when I move to another computer and run that XML file, it returns an error when I run the statement to create and build the mining model. What do I need to change in the XML, or how can I run the XML successfully on the other computer? And do I have to rebuild the data source and data source view there, and if so, how?
Today I was making a few reports. When I tested the reports in Visual Studio they worked great: I got the expected results. But when I deployed the reports to our report server, the problems started. When I click on the directory in which my reports are deployed, I see my 4 reports; up to that point everything works correctly. But when I click on a report to view the results, it goes wrong. I get an error: "Cannot create a connection to data source 'Live'" (Live is the name of our data source).
We are using Windows logins, and I am sure that I have all the rights on the server; I gave myself 'sysadmin' rights, so it should work. I have also tried it with all the roles assigned to my account, but it still won't work.
When I modify the data source and point it to another server and database, it works. The data source 'Live' is on an x64 MS SQL server, and the other data source is on an x86 MS SQL server. Maybe that is the problem?
I was trying to load data using an SSIS Data Flow Task with an OLE DB source (the source was a view) going to an OLE DB destination (SQL Server). This view returns 420,591 rows from Query Analyzer in 21 seconds; the row length is 925. When I tried to execute the Data Flow Task from SSIS, I had to stop the process after 30 minutes because only 2,000 rows had been retrieved. I modified the view to return TOP 440,000 and reran; this time all 420,591 rows were retrieved and written in 22 seconds. Next I tried TOP 100 PERCENT; again, only 2,000 rows were returned after 30 minutes. TempDB is on a separate SAN RAID group with 200 GB free, the databases are on a separate drive with 200 GB free, the server has 13 GB of memory, and no other processes were executing.
The only way I could populate the table from SSIS was to use an Execute SQL Task and hard-code an INSERT into the table selecting data from the view (35 seconds).
Has anyone else experienced this or a similar issue? Does anyone have a solution or explanation?
I am in the process of designing a web application for our application that will be available to the general public. Basically it will be used to collect information for a case report. I need to know whether there is a way for our internal database-driven application to push certain data up to the website; the internal database may be behind a firewall, so I cannot make a direct connection to the external database. I thought I had heard about some sort of data-sharing protocol. I will then store the data users enter in the website database and pull it back into the internal application. So basically I need a secure way of sending and receiving data between a server/client app on a LAN and the website database, which could be hosted outside the LAN. I hope this makes sense.
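One possible shape for this, sketched only under assumptions (the URL, handler, and credentials below are placeholders): expose an HTTPS endpoint on the hosted site that writes into the site's database, and have the internal application push and pull through it, so the only traffic crossing the firewall is outbound HTTPS from the LAN.

using System.Net;
using System.Text;

class CaseReportPush
{
    static void Main()
    {
        using (WebClient client = new WebClient())
        {
            // Placeholder endpoint and credentials; the handler on the website
            // validates the caller and inserts the payload into the site database.
            client.Credentials = new NetworkCredential("apiUser", "apiPassword");
            string payload = "<caseReport><caseId>123</caseId></caseReport>";

            byte[] reply = client.UploadData(
                "https://www.example.com/api/casereport.ashx",
                Encoding.UTF8.GetBytes(payload));

            // The internal app can pull user-entered data back the same way:
            byte[] entered = client.DownloadData(
                "https://www.example.com/api/casereport.ashx?since=2008-01-01");
        }
    }
}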
Hi there. I have written code for the Submit button in VB so that when people key in their email address, it is saved in my database called test.mdf. But when I debug it and enter my email address in the textbox on my website and click the Submit button, the page reports that the email was not entered into my database. Here is the code; can anybody help me? I'm really lost here.

Protected Sub SubmitButton_Click(ByVal sender As Object, ByVal e As System.EventArgs)
    Dim TestDataSource As New SqlDataSource()
    TestDataSource.ConnectionString = ConfigurationManager.ConnectionStrings("TestConnectionString1").ToString()
    TestDataSource.InsertCommandType = SqlDataSourceCommandType.Text
    TestDataSource.InsertCommand = "INSERT INTO Email(EmailAddress, IPAddress, DateTimeStamp) VALUES (@EmailAddress, @IPAddress, @DateTimeStamp)"
    TestDataSource.InsertParameters.Add("EmailAddress", emailAddressTextBox.Text)
    TestDataSource.InsertParameters.Add("IPAddress", Request.UserHostAddress.ToString())
    ' InsertParameters.Add expects a string value, so convert the date explicitly.
    TestDataSource.InsertParameters.Add("DateTimeStamp", DateTime.Now.ToString())

    Dim rowsAffected As Integer = 0
    Try
        rowsAffected = TestDataSource.Insert()
    Catch ex As Exception
        Server.Transfer("test_problem.aspx")
    Finally
        TestDataSource = Nothing
    End Try

    If rowsAffected <> 1 Then
        Server.Transfer("test_problem.aspx")
    Else
        Server.Transfer("Test_Confirm.aspx")
    End If
End Sub
I have a problem with my website's SQL database. I have a website, and the site maker told me the only way to change my password is within my SQL database files, and that it would be over my head. How can I do this? I know my current password, but it was changed today by the website maker. I have had problems with someone blackmailing me over protection of my site, and all my members were deleted last night. I contacted the maker of the website and was told that kind of thing is common. Can anyone tell me how I can change my password, and what programs or help files on the net to use? I already have Dreamweaver 2004. Thanks, Jim
I just have a simple question that can hopefully settle whether I should dive deeply into Reporting/Analysis Services. I'm looking to run some clustering algorithms on user-inputted keywords on objects within my database (please see http://flickr.com/photos/tags/friends/clusters/ for an example of what I'm talking about). It seems to me that the various reporting methods that SQL Server provides would be an ideal, quick, and easy way of providing this for my SQL Server 2005 database.
Can the reporting services provide this type of functionality? And if so, would this be scalable? I would want to be able to access this clustered data in much the same way I do queries across my database and would want them to be done quickly and efficiently.
[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
and also this:
[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns.
The column "Fiscal Week" needs to be updated in the external metadata column collection.
The column "Fiscal Year" needs to be updated in the external metadata column collection.
The column "1st level" needs to be added to the external metadata column collection.
The column "2nd level" needs to be added to the external metadata column collection.
The column "3rd level" needs to be added to the external metadata column collection.
The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection.
The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection.
The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.
I tried going to the data flow -> Excel connection -> Advanced Editor for the Excel source -> Input and Output Properties and tried to refresh the affected columns. It seems that somehow the 3 columns are not read in from the source file, and also that Fiscal Year and Fiscal Week are not set up properly in my data destination? Has anyone faced such errors before?
RE: XML data source: Expression? Variable? Connection? Error: unable to read the XML data.
I want my XML data source to be an expression, as I will be looping through a directory of XML files.
I don't see the expression property or the connection property.
I tried setting the XMLData property to @[User::filename], but that results in:
Information: 0x40043006 at Load XML Files, DTS.Pipeline: Prepare for Execute phase is beginning.
Error: 0xC02090D0 at Load XML Files, XML Source [108]: The component "XML Source" (108) was unable to read the XML data.
Error: 0xC0047019 at Load XML Files, DTS.Pipeline: component "XML Source" (108) failed the prepare phase and returned error code 0xC02090D0.
Information: 0x4004300B at Load XML Files, DTS.Pipeline: "component "OLE DB Destination" (341)" wrote 0 rows.
Task failed: Load XML Files
Information: 0xC002F30E at Bad, File System Task: File or directory "d:jcpxmlLoadjcp2.xml.bad" was deleted.
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
The program '[3312] Package.dtsx: DTS' has exited with code 0 (0x0).
I have a question about how to handle structural data model changes in a PowerPivot data source. Suppose I am developing a star model in SQL Server and sometimes a data type changes or the name of a field in a table changes. It seems to me that PowerPivot does not handle this as gracefully as Analysis Services Multidimensional (mostly) does: I received an error because of a wrong field name, and even no error at all when a data type changed in PowerPivot. Is this common, or am I doing something wrong? Does this mean that every time the data model changes, the PowerPivot model should be recreated? Or am I missing the clue here?
I was wondering whether there is a best-practice minimum set of permissions for a SQL login used when setting up a new shared data source for SSRS Report Manager.
Something along the lines of it being a data reader on the database, plus permissions to update tempdb?
I would have thought it not advisable for the login to be able to update the main database...
I have 4 tablix controls; 2 of them get data from Server 1 and the other 2 get data from Server 2. I have set the NoRowsMessage to "Data Not Available for the Selected Values" for all 4 tablixes. Now, if data is not available from Server 1, I want "Data Not Available for the Selected Values" to appear only once in the output, but it currently appears twice because the 2 tablixes have no rows. Similarly, if data is not available from Server 2, it should show "Data Not Available for the Selected Values" only once in my output. If data is not available from any of the tablixes, the report output should also show "Data Not Available for the Selected Values" only once.
A DataReader source is using a connection manager to connect to an ODBC System DSN. A query is provided in the SqlCommand property. Data is being truncated in the only string column. The data type in the DataReader's output -> external columns shows as Unicode string [DT_WSTR], length 7.
The truncated output in a text file is the first 3 characters, reading left to right. Changing the column order has no effect.
A linked server was created in SQL Server Management Studio to test the ODBC System DSN using the following:
Data returned using OPENQUERY does not truncate the string column, indicating that the ODBC driver returns data as expected with SQL 2005, but not with the DataReader source.
I need to use a newly installed SSIS component inside an SSIS 2012 project, but in SSDT 2010 I cannot see the SSIS Data Flow Items tab for adding data sources/data destinations in the Choose Toolbox Items pane.
Hi, I am trying to do a straightforward load from a flat file source. I have defined the columns according to the lengths given in the data dictionary provided, but when I try to run the task I encounter this error:
The column data for column "Column 20" overflowed the disk I/O buffer.
I tried to add another column, Column 21, at the end and either truncate it or leave it unmapped to the destination, but the same problem occurs for Column 21. What should I do to overcome this?
Also, in the case of bad data, how do I clean up the source? Please help me with this.
I've got a report that is using a cube as a data source and I can't get the report to show all the data. Only data at the lowest level of the cube is displayed. The problem is that most of the data I'm concerned with is at higher levels. There's no problem with the MDX. I get the correct results when I run the query.
I'm using a table to show the results. I've also tried a matrix, but I get the same results. I'm using SSRS 2005 and SSAS 2000.
Anyone have experience with this? Am I missing something simple?
I am pretty new to SSIS. I am trying to create a package that can accept data in any of several formats (CSV, Excel, a SQL Server database/table) and import the data into my destination database.
So far I've managed to get this working OK. However, I am now totally stuck. I'm currently trying to concentrate on the data sources being a CSV file (using a flat file data source) and/or an Excel spreadsheet.
I can get the data in and to my destination using a UNION ALL component and mapping the data sources to it so long as both the CSV file and the Excel spreadsheet exist.
My problem is that I need my package to handle the possibility that only the CSV file exists and there is no Excel spreadsheet, in which case I'd like the package to ignore the Excel data source completely. Currently, if either of my data sources does not exist, I get errors and the package terminates.
Is there any way in SSIS to check all my data sources to see which ones exist (i.e. are valid)? If they exist I want to use them; if one doesn't exist I'd like to discard it without error (as long as there is a single data source, the package should run).
I've tried using the AcquireConnection method in a Script Task on each of my connections, hoping that it would raise an error if the file/data source did not exist. It doesn't, though (in the case of an Excel data source it just creates an empty Excel file for me).
The only other option I can come up with is to have separate packages depending on the type of data we want to import, and then run a particular package depending on the format of the source data. This seems a bit long-winded. I am pretty sure I must be able to do what I want to achieve, but I can't work out how.
I'll be grateful to anyone who can send me any tips/hints/links on how I can achieve this.
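One way to approach the "skip missing sources" behaviour, sketched here under assumptions: a Script Task only checks whether the files exist and records the result in package variables (the variable names below are made up); the booleans then drive precedence-constraint expressions or the Disable property of the relevant tasks, typically together with DelayValidation on the affected connections.

// Body of an SSIS Script Task's Main() method.
// Assumed package variables: User::CsvPath and User::ExcelPath (ReadOnly),
// User::CsvExists and User::ExcelExists (ReadWrite, Boolean).
using System.IO;
using Microsoft.SqlServer.Dts.Runtime;

public void Main()
{
    Dts.Variables["User::CsvExists"].Value =
        File.Exists(Dts.Variables["User::CsvPath"].Value.ToString());

    Dts.Variables["User::ExcelExists"].Value =
        File.Exists(Dts.Variables["User::ExcelPath"].Value.ToString());

    Dts.TaskResult = (int)ScriptResults.Success;
}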
I am new to SSIS programming, so bear with me if my question seems naive to you gurus. I have a situation where I need to set the data source for a data flow from an external .NET application ('external' meaning the application runs in a different process than the SSIS package). I am trying to set the data on which the data flow works from my C# application, in a DataSet format. The ideal solution would not save the DataSet to any file on disk (I know that would work, but it has the overhead of writing, reading, and managing the temp file). What I want to achieve is that the business logic for picking the data the SSIS data flow processes is controlled inside my C# application, and the data flow just does what it does best: transformation. Has any of you successfully done this before? Thanks!
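One sketch that avoids a temp file (offered only as a possibility, with a placeholder package path and variable name) is to hand the DataSet to the package through an Object-typed package variable and read it inside a Script Component configured as a source.

using System.Data;
using Microsoft.SqlServer.Dts.Runtime;   // SSIS runtime API, used from the external application

class RunPackageWithDataSet
{
    static DTSExecResult Run(DataSet pickedData)
    {
        Application app = new Application();
        Package pkg = app.LoadPackage(@"C:\Packages\Transform.dtsx", null);

        // Object-typed package variable; inside the data flow, a Script Component
        // source casts the variable back to a DataSet and emits its rows.
        pkg.Variables["User::InputData"].Value = pickedData;

        return pkg.Execute();
    }
}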
I have a stored procedure that receives an input parameter of ReportID and then sends back the appropriate dataset to the data-driven subscription. In the stored procedure, there is some concatenated SQL used to produce the final SELECT statement sent to the subscription, but it is generating the following error:
The dataset cannot be generated. An error occurred while connecting to a data source, or the query is not valid for the data source. (rsCannotPrepareQuery) Get Online Help
Insert Error: Column name or number of supplied values does not match table definition. Because there are some optional columns that need to appear conditionally based on the report, I can't avoid concatenation. My only other option is to have hundreds of individual tables to source all our reports, which I don't have time to manage.
Please let me know how I can allow the data-driven subscription setup process to discover the columns for my concatenated SQL, so I can set up my schedules based on the single stored procedure with all my logic nested in my table structure.
I'm new to SQL Server. Last week I made my first cube and everything went fine. But when I want to explore the cube (from the Explore panel in Visual Studio 2012) in Excel, I get the error "Initialization of the data source failed".
Check your database server or contact your database administrator. Make sure the external database is available, and then try the operation again. If you see this message again, create a new datasource to connect to the database.
I'm using SQL Server 2012, Visual Studio 2012, and Excel 2013 on Windows 10.
Hi, I'm wondering which is the best way to search data in SQL Server. I can reach the data using Data Sources and Data Source Views, and also with an OLE DB source using a data access mode of named query. I have to write the data into a flat file. So, does anyone know the best practice for this, or are both of them good choices? Thanks for your help.
I have created an SSIS package that transfers data from a FoxPro database to an instance of SQL Server 2005 Express. I used the wizard to create the package, but I load and execute the package within a custom application that I have written in C#.
The way the custom application is intended to work is that the user can have the database in any location on the computer; all he has to do is specify the location, and then the application programmatically changes the location of the source in the package it has loaded and executes it. When I initially run the package the first time (using the original path), it works fine and transfers the data. However, every subsequent time I run the application and specify a different path, the database on the SQL Server side gets created as expected but the data is not transferred!
Where am I going wrong? Do I need to save the package after I modify the source, then reload and run it again, or do I need to change something else in the data flow to make this work?
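For the "change the source location" step, one sketch (not necessarily the cause of your problem) is to overwrite the source connection manager's ConnectionString on the loaded package each time before executing it; the connection manager name and the VFP OLE DB provider string below are assumptions.

using Microsoft.SqlServer.Dts.Runtime;

class RunFoxProImport
{
    static DTSExecResult Run(string userSelectedFolder)
    {
        Application app = new Application();
        Package pkg = app.LoadPackage(@"C:\Packages\FoxProImport.dtsx", null);

        // "SourceConnection" is whatever the wizard named the FoxPro connection manager.
        ConnectionManager source = pkg.Connections["SourceConnection"];
        source.ConnectionString =
            "Provider=VFPOLEDB.1;Data Source=" + userSelectedFolder + ";";

        return pkg.Execute();
    }
}

Reloading the package each time the path changes, rather than reusing one already-executed Package object, may also be worth checking.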
I am having a very weird issue with a DB2 SQL query. I need to get data from DB2 and insert it into our SQL Server database. In a Data Flow task, I am using a DataReader source to get the data; the connection is an ADO.NET ODBC connection. The SQL query also has some comments in it.
The first weird thing is...
1. On the development server, when I run this query manually (using Toad or WinSQL connected to the DB2 database), the query runs fine and brings back approximately 667 rows, which is correct. On the same server, when I try to run this query via an SSIS package Data Flow task using the DataReader source, it gives me errors on the comments that exist in the query. But if I run the same SSIS package on another server (the integration server), it runs fine. The same package also runs fine if I run it from my machine. So what is different on my dev server compared to the integration server?
2. If I take those comments out of the SQL query and then try to run the SSIS package, the query gets stuck at the first record and goes into an infinite loop, even though my query is not a procedure, just a SQL statement. But this SSIS package with the same query runs absolutely fine on the other server. I also tried using the other connection types and an OLE DB source, but the problem on the dev server remains the same.
What do I need to look for that is so different on the dev server compared to the INT server? I also checked the Visual Studio 2005 version on both servers (by going to About Microsoft Visual Studio), and it is the same.
This is what I have on both servers: Microsoft SQL Server Integration Services Designer, Version 9.00.3042.00.