I've got SQL Server 7.0 running in one domain and I'm trying to access it
from a workstation belonging to another domain. There is no trust
between the domains, but it worked fine with SQL 6.5.
I've tried several combinations without success. My best result is
an error message followed by a login box every time the application
(spreadsheet etc.) tries to access its data source.
It works fine from workstations belonging to the SQL Server's domain.
Any ideas?
Thanks
Jan
We are what we repeatedly do. Excellence, then, is not an act, but a
habit. -- Aristotle
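For what it's worth, the usual workaround when there is no trust relationship is to fall back to SQL Server (standard) authentication instead of Windows NT authentication and point the workstation's DSN at a SQL login. A minimal sketch for SQL Server 7.0, where the login name 'appuser' and database 'AppDB' are made-up examples:

-- Sketch only: create a standard SQL Server login and grant it access
-- to the application database so a workstation in an untrusted domain
-- can connect without an NT trusted connection.
EXEC sp_addlogin 'appuser', 'StrongPassword1', 'AppDB'   -- login, password, default database
GO
USE AppDB
EXEC sp_grantdbaccess 'appuser'                          -- map the login to a database user
GO

The ODBC data source on the remote workstation would then be configured to use SQL Server authentication (login ID and password) rather than Windows NT authentication.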
I have a SQL Express database which I need to access from a shared hosting plan. I can create an ODBC connection through the hosting provider's control panel for SQL Server, but it won't connect. I tested this locally and discovered that the SQL Native Client connects fine, but the older SQL Server ODBC driver does not. This seems to happen only with SQL Server 2005 Express Edition; it works with the Developer Edition. Does SQL Express only work with the SQL Native Client?
I have this posted in the VFP section of the forums, but the more I find out about the issue the more I think it is a SQL Server security issue.
What I am doing is trying to connect to a linked server defined on my SQL Express 2005 database running on Windows Vista. I have the linked server set up to connect to a local FoxPro table using the VFPOLEDB provider. If I log in as the 'sa' account I am able to run my query, but if I log into the server using a trusted connection it doesn't work. I get the error [The OLE DB provider "VFPOLEDB" for linked server "sys" reported an error. The provider did not give any information about the error.]
So I went in and made the BUILTIN\Users login a member of the sysadmin role. I also made sure that all of the security settings in SQL Server were set up the same between the trusted user and the 'sa' user.
My question is: what am I missing? Is there something that I need to set in Windows, or is there a setting in SQL Server?
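For reference, the linked server being described would have been defined roughly like this; the data source path and the login mapping shown here are assumptions, not the poster's actual settings:

-- Sketch of the VFPOLEDB linked server setup (names and path are assumed).
EXEC sp_addlinkedserver
    @server     = N'sys',
    @srvproduct = N'Visual FoxPro',
    @provider   = N'VFPOLEDB',
    @datasrc    = N'C:\Data\FoxProTables';    -- hypothetical folder or .dbc path

-- Map every local login (including Windows/trusted logins) to the provider
-- without forwarding credentials; the login mapping is one of the first
-- things to compare when 'sa' works but a trusted connection does not.
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'sys',
    @useself     = 'FALSE',
    @locallogin  = NULL,
    @rmtuser     = NULL,
    @rmtpassword = NULL;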
I don't know what category would be appropriate for this question but security seems to be close enough.
Here is my scenario: I am running an automated application that extracts data from a web site and stores it in a table on SQL Server 2005. The information is not confidential in the sense of social insurance numbers or bank account numbers, but it should not be seen by a typical employee (it is of no use to them). After the data has been stored, the application retrieves it from the same table, processes it, and updates the same table. It runs every hour, indefinitely.
Should all the insert, update, and select queries be wrapped in stored procedures? I am not concerned about performance; my concern is design and security.
Is it worth hiding the details of inserting/updating/selecting behind a stored procedure, or should I just allow the program to send SELECT/UPDATE/INSERT queries directly?
No employee (other than the developer and the DB admin) or customer ever accesses this table (they do not have permission in SQL Server). The username and passwords were created with security in mind.
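If you do go the stored procedure route, the usual pattern is to give the application's login EXECUTE on the procedures and no rights on the table itself. A minimal sketch, where the table dbo.ScrapedData and the user app_user are invented names for illustration:

-- Sketch: wrap the insert behind a procedure and grant the app only EXECUTE.
CREATE PROCEDURE dbo.usp_InsertScrapedData
    @SourceUrl nvarchar(500),
    @Payload   nvarchar(max)
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.ScrapedData (SourceUrl, Payload, LoadedAt)
    VALUES (@SourceUrl, @Payload, GETDATE());
END
GO

-- The application account can run the procedure but cannot touch the table directly.
GRANT EXECUTE ON dbo.usp_InsertScrapedData TO app_user;
DENY SELECT, INSERT, UPDATE, DELETE ON dbo.ScrapedData TO app_user;

The same pattern applies to the select and update steps; the payoff is mainly a smaller attack surface and a single place to audit, since direct table permissions are never granted.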
If I create a system DSN ODBC connection for SQL Server 7.0 (driver 3.70.09.61), will the password be encrypted with standard 128-bit encryption?
I want to make sure that it is safe to use ODBC for SQL Server. I found something from Microsoft about tracing, but I am not sure whether it applies to ODBC 3.0 or later.
Any related docs or urls for this?
Any comments and suggestion are always welcome. -MAK
I recently converted an MS Access VB app to MS SQL Server 2005 Express, with VB using ODBC for the network connection to the DB. One of the users began receiving the following error: "Run-time error '-2147467259 (80004005)' [Microsoft][ODBC SQL Server]Cannot open database "dbname" requested by the login. The login failed."
When creating the SQL Express instance I took the defaults allowing Windows Login to the DB.
What I do not understand is that the user's profile is "Domain User". When the user's profile is switched to "Domain Admin", the login is successful.
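One thing worth checking is whether ordinary domain users actually have a login and a database user mapped for them; with the SQL 2005 Express defaults, typically only the local administrators group is granted access, which would explain why switching the profile to "Domain Admin" works. A hedged sketch of granting a Windows group access (the group name is invented):

-- Sketch: grant a Windows group its own login and database user.
CREATE LOGIN [MYDOMAIN\AppUsers] FROM WINDOWS;
GO
USE [dbname];
CREATE USER [MYDOMAIN\AppUsers] FOR LOGIN [MYDOMAIN\AppUsers];
EXEC sp_addrolemember 'db_datareader', 'MYDOMAIN\AppUsers';
EXEC sp_addrolemember 'db_datawriter', 'MYDOMAIN\AppUsers';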
I created a SQL user who is a member of the sysadmin, securityadmin, setupadmin and serveradmin roles.
When I try to connect through ODBC using this user from other machines, it works fine. But if I remove it from sysadmin, I get an error message: Connection Failed:
SQLState: '28000' SQL Server Error: 18456 [Microsoft][SQL Server Native Client 11.0][SQL Server][Login Failed for user: user1]
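Since sysadmin bypasses all database-level checks, removing it often exposes a missing piece of ordinary access: a default database the login can no longer reach, or no user mapped in the database it needs. A hedged sketch of what to verify (the database name is an assumption):

-- Sketch: make sure user1 still has a reachable default database
-- and an explicit user mapping in the database it connects to.
ALTER LOGIN user1 WITH DEFAULT_DATABASE = MyAppDb;
GO
USE MyAppDb;
CREATE USER user1 FOR LOGIN user1;
EXEC sp_addrolemember 'db_datareader', 'user1';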
Hello, I have a package that contains an ADO.NET connection manager using an ODBC provider. There is a user name and password configured in the connection manager. The connection is to a DB2 database.
I install the package on SSIS by setting the package ProtectionLevel to ServerStorage. I then save the package to the server using File > Save Copy of (package name) As, where I also set the protection level to use server storage.
When I enable package configurations (SQL Server), the package fails when run by a Job. In the log, I get the following error message from the DataReader source that uses the connection manager:
****** System.Data.Odbc.OdbcException: ERROR [08001] [IBM][CLI Driver] SQL30082N Security processing failed with reason "3" ("PASSWORD MISSING"). SQLSTATE=08001 at Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSConnectionManager90.AcquireConnection(Object pTransaction) at Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter.AcquireConnections(Object transaction) at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostAcquireConnections(IDTSManagedComponentWrapper90 wrapper, Object transaction) ******
The DataReader fails validation. When I disable package configurations, the Job executes the package without a problem. The server that the package runs on is 64-bit.
Why do I have this password problem when package configurations are enabled? The connection manager password is included in the package configurations.
I am using SSIS 2014 with the .NET Framework version listed below, installed on Windows Server 2012 R2. I have installed my client's ODBC drivers (both 32-bit and 64-bit) on my production server and created ODBC system DSNs for 32-bit and 64-bit.
When I open SSIS 2014 and try to create the ODBC connection, I can see only the 32-bit system DSN; I cannot see my 64-bit ODBC system DSN.
Microsoft Visual Studio 2012 Shell (Integrated) Version 11.0.50727.1 RTMREL Microsoft .NET Framework Version 4.5.51650
SQL Server Integration Services: Microsoft SQL Server Integration Services Designer Version 12.0.1524.0
On my local system I installed the same client ODBC drivers (32-bit and 64-bit) and created ODBC system DSNs, and when I open SSIS 2014 there I can see both the 32-bit and 64-bit system DSN connections in the SSIS ODBC connection dialog.
My local system runs Windows 7 with the .NET Framework version listed below. I also have SSIS 2012 installed on it, and I can see both ODBC connections from 2012 as well.
Microsoft Visual Studio 2012 Shell (Integrated) Version 11.0.50727.1 RTMREL Microsoft .NET Framework Version 4.5.50938
SQL Server Integration Services: Microsoft SQL Server Integration Services Designer Version 12.0.1524.0
Why can I not see the 64-bit ODBC system DSN connection from SSIS on my production server?
I am using VB.NET 2005 and set up an ODBC connection via Odbc.OdbcConnection to an MDB database, using the "Microsoft Access ODBC Driver (*.mdb)".
When I set up an OdbcCommand such as "ALTER DATABASE..." or "CREATE TABLE..." and issue it with com.ExecuteNonQuery(), I get an error from the ODBC driver saying that a SQL statement has to begin with SELECT, INSERT, UPDATE or DELETE.
How can I use DDL statements via ODBC?
I would appreciate if you could help me to use ODBC for that - no OLE, no ADO.
I apologize if this is not the correct forum for this posting. Looking at the descriptions, it appeared to be the best choice.
I am running Windows XP Pro SP2. I have installed the SQL Native Client for XP. However, when I try to add a new data source through ODBC Connection Manager, SQL Native Client is not listed as an option. I have followed this procedure on three other systems with no problems. What would be causing the SQL Native Client to not show up in the list of available ODBC data sources?
I'm sorry to be ignorant on this point. It seems trivial, but what's the difference between @@ and @ when using variables in T-SQL? I have a developer that always uses @@ for local variables and @ for reference variables (meaning variables declared as parameters for a stored procedure or function).
Is that purely stylistic? Is it a holdover from some previous version? Or is it a legitimate best practice that I've not seen before?
My google-shui is weak today; I found nothing when searching.
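For what it's worth, T-SQL itself only defines the single @ prefix for user variables and parameters; names starting with @@ are used by built-in system functions such as @@ROWCOUNT and @@IDENTITY (older documentation called them "global variables", which is probably where the habit comes from), and the parser merely tolerates user names that begin with @@. A small self-contained illustration:

-- @name  : user-declared local variable or procedure parameter
-- @@name : by convention, reserved for built-in system functions
DECLARE @Orders TABLE (OrderID int, Status varchar(20));
INSERT INTO @Orders (OrderID, Status) VALUES (42, 'Open');   -- sample row

DECLARE @RowsAffected int;
UPDATE @Orders SET Status = 'Shipped' WHERE OrderID = 42;
SET @RowsAffected = @@ROWCOUNT;   -- @@ROWCOUNT is a system function, not a user variable
SELECT @RowsAffected AS RowsAffected;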
I posted this issue a week ago and haven't got any reply yet, so I am posting it again; I desperately need your help.
The article http://msdn2.microsoft.com/en-us/library/ms365343.aspx says Model Item Security can be set for different security filters, but when I use SQL Server Management Studio to set Model Item Security, the "Permissions" property seems to override the "Model Item Security" property. (My report server is using Custom Authentication.)
For example, in the "Permissions" property of the model, if I check "Use these roles for each group or user account" without setting any user or group, then no matter what users I add to "Model Item Security" with "Secure individual model items independently for this model" checked, no user can see the model in Report Manager or Report Builder.
In the same situation, if I add "user1" and give it a role such as "Browser" in the "Permissions" property, and check "Secure individual model items independently for this model" in the "Model Item Security" property, then even though I did NOT grant "user1" access to the root model or any entities under the model, "user1" is able to access the model and all entities in Report Builder.
My question is: on the same report model, how do I set "AdminFilter" (an empty security filter) for administrator permissions and "GeneralFilter" (filtered on UserID) for general users based on their UserID?
The article also says:
"Security filters are always applied, even for users who have Content Manager or Administrator permissions to the model. To allow administrators or other users to see all rows of an entity on which row-level security is defined, you can create an empty security filter (which always returns True) and then use the filter to grant those users access to all the rows."
So I defined two filters, "GeneralFilter" and "AdminFilter", on the "Staff" entity of my report model "SSRSModel". I expect that after I deploy the report model, administrator users can use Report Builder to build reports with all rows available, while non-admin users can only see rows based on their UserID.
I can only get one result at a time, not both:
either the rows are filtered or they are not filtered at all, no matter how I set the "SecurityFilter" for the entity. I tried setting both "AdminFilter" and "GeneralFilter" as SecurityFilter at the same time, a combination of "DefaultSecurityFilter" and "SecurityFilter", and one at a time.
Hi all, I am having trouble getting a linked Oracle 9 server in MS SQL Server 2005 Express to work properly. My machine is running Windows XP. The Microsoft and Oracle OLE DB providers have problems dealing with Oracle's numeric data type, so I decided to use Microsoft's OLE DB Provider for ODBC and an Oracle ODBC source. When using the Microsoft ODBC for Oracle driver in my ODBC source I get inconsistent behavior. Sometimes my queries are processed properly, and other times I get the following error:
OLE DB provider "MSDASQL" for linked server "ODBCBEAST" returned message "[Microsoft][ODBC Driver Manager] Driver's SQLSetConnectAttr failed".
OLE DB provider "MSDASQL" for linked server "ODBCBEAST" returned message "[Microsoft][ODBC Driver Manager] Driver's SQLSetConnectAttr failed".
OLE DB provider "MSDASQL" for linked server "ODBCBEAST" returned message "[Microsoft][ODBC driver for Oracle][Oracle]".
Msg 7303, Level 16, State 1, Line 1
Cannot initialize the data source object of OLE DB provider "MSDASQL" for linked server "ODBCBEAST".
I have no idea why sometimes I can connect to the linked server with no problems and why other times it behaves like this. I'm not changing anything about the system that I can think of. When I use an Oracle client (PL/SQL) I have absolutely no problems connecting, and TNSPING reports that the connection is good.
This is unacceptable, so I decided to try my luck with the Oracle 10g ODBC driver. However, when I use this and perform an OPENQUERY select against the linked server I get back only 11 rows, when I know that the database has over 100 rows (in fact, that is what I get when the Microsoft ODBC driver works). I figured maybe the buffer setting needed to be raised in the ODBC configuration, so I took it from 64000 to 600000 (roughly a factor of 10), but I still get back only 11 rows.
I'm at my wit's end. Any suggestions on resolving one or the other problem would be much appreciated. Thanks much.
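For reference, a linked server over MSDASQL like the one described is typically defined along these lines; the DSN, schema, and table names here are assumptions:

-- Sketch of the MSDASQL/ODBC linked server definition being described.
EXEC sp_addlinkedserver
    @server     = N'ODBCBEAST',
    @srvproduct = N'Oracle via ODBC',
    @provider   = N'MSDASQL',
    @datasrc    = N'OracleOdbcDsn';    -- assumed system DSN name

-- OPENQUERY test against the linked server (schema/table names assumed).
SELECT *
FROM OPENQUERY(ODBCBEAST, 'SELECT COUNT(*) AS row_count FROM some_schema.some_table');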
Both these tables contain considerable numbers of rows, but over time tableA will end up containing orphaned values (i.e. rows whose a_id is not used in tableB), and this problem cannot be rectified by, for example, cascading deletes.
To fix this problem I decided to write a simple stored procedure to purge all rows in tableA whose a_id is not used in tableB:
DELETE FROM tableA WHERE a_id NOT IN (SELECT a_id FROM tableB)
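One caveat with NOT IN is that it deletes nothing at all if tableB.a_id can ever be NULL, so a NOT EXISTS form is often the safer way to write the same purge:

-- Equivalent purge using NOT EXISTS, unaffected by NULLs in tableB.a_id.
DELETE FROM tableA
WHERE NOT EXISTS (
    SELECT 1
    FROM tableB
    WHERE tableB.a_id = tableA.a_id
);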
Now although the following document relates to postgres :
I wonder if anyone else out there has the same impression that I have: I find that DTS works much better than SSIS.
I find that DTS is so easy to use and reliable: it gets the job done and fast! On the other hand, SSIS seems to be so needlessly complex that it takes hours of troubleshooting just to get it to work, and sometimes it doesn't work at all. For example I have just spent hours trying to get SSIS to import a flat file with 300,000 rows. It just crashes and doesn't even give an error message so that one can fix it. On the other hand I have just now successfully accomplished the same task with DTS and it took me 5 minutes!
I honestly don't see a valid reason for using SQL Server 2005 instead of 2000. So far it's much more productive to use 2000.
Hi all, any suggestions / views / help on the question below would be welcomed.
I am building an ASP.NET 2.0 application with SQL 2005 Express as the back end. My back end has three major tables:
tblArticles - saves basic info on articles posted by users (articleid, title, short desc, rating, views, etc.)
tblCategories - saves the various categories and their hierarchies (id, parentid, name, etc.)
tblArticleCategories - saves which articles fall in which categories (articleid, categoryid)
As of now I am caching all rows from the first two tables, but I am in a bit of doubt about caching the third table (tblArticleCategories). The data in this table won't change very often, the table has just two columns and not many rows, so it is a good target for caching. The reason I am in doubt is that when a website visitor clicks a category link in the category tree view, I need an inner join across all three tables to locate and return all articles in that category. I can do the same thing without hitting the database: since I already have two of the required three tables in my cache, I can simply add the third table and then, using the DataView object's RowFilter property on these three cached tables, get the appropriate results.
Which of the two methods would you prefer and suggest? Do you feel that, just to save hits against the database, I am going too far and doing a lot of work in the DataView (which might not be as efficient as the SQL engine), or do you feel that the inefficiency of the DataView still wins compared to the cost of hitting the database for this?
Thanks in advance, bye, take care. Raj Chaudhari, Mumbai, India (MCAD.NET) www.xtremebiz.biz
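For comparison, the database-side version of that category lookup is a single three-table join, which SQL Server handles easily; the column names beyond the keys mentioned above are assumptions:

-- Sketch: all articles in a given category, joining the three tables described above.
DECLARE @CategoryId int;
SET @CategoryId = 1;   -- supplied by the page in practice

SELECT a.articleid, a.title, a.short_desc, a.rating, a.views
FROM   tblArticles          AS a
JOIN   tblArticleCategories AS ac ON ac.articleid = a.articleid
JOIN   tblCategories        AS c  ON c.id         = ac.categoryid
WHERE  c.id = @CategoryId;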
Hello all, I have a table 'statistics' which holds information about another table, i.e. the number of rows belonging to each user. Would I be better off using a trigger after each insert to increment a counter, or selecting the data with a SQL statement and updating the column whenever the statistics page is requested? Does SQL provide any methods which allow a column to count other rows or columns?
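If the count only needs to be right when the statistics page is viewed, a plain aggregate avoids maintaining a counter at all; if it has to live in the statistics table, an insert trigger can keep it in sync. A hedged sketch of both, with invented table and column names:

-- Option 1: compute the per-user row count on demand (no counter to maintain).
SELECT user_id, COUNT(*) AS row_count
FROM   dbo.user_rows              -- assumed name of the table being counted
GROUP BY user_id;

-- Option 2: keep a row_count column in the statistics table in sync on insert.
CREATE TRIGGER trg_user_rows_insert ON dbo.user_rows
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE s
    SET    s.row_count = s.row_count + i.cnt
    FROM   dbo.user_statistics AS s          -- assumed statistics table name
    JOIN  (SELECT user_id, COUNT(*) AS cnt
           FROM inserted
           GROUP BY user_id) AS i ON i.user_id = s.user_id;
END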
Has anyone here ever used the Informix database and can give me some differences between Informix and SQL Server?
One of our users is thinking about purchasing a COTS product that only supports an Informix database. I need to convince the user to evaluate rival applications that can support SQL Server, and I need some arguments in favor of not going with Informix.
We currently use CA ArcServe (ArcServe 6.5 Enterprise and Single Server Editions) to back up our Windows NT files and MS SQL Server databases. We have experienced significant reliability issues with ArcServe. Many times we have found ourselves rebuilding a corrupt ArcServe Job (ArcServe's backup schedule) database. One of our NT servers occasionally bug checks when ArcServe is performing backups. Occasionally ArcServe Jobs incorrectly reschedule themselves. Sometimes Jobs do not complete but stay executing, not performing any work, and cancelling them can require a lot of effort. The ArcServe job DB repair utility generally does not work. The user interface is lacking; for example, the job scheduling options are very limited. CA tech support for this product has been poor. Because we have issues with ArcServe's stability we are now evaluating Veritas (formerly Seagate) Backup Exec for NT. What are other people's experiences with these two products?
I've got a network tech that I work with from time to time. He's gonna migrate an Access database over to SQL Server. He says it should be easy: it's a flat file, he can just do it through Enterprise Manager. I warned him that data types can become an issue (you kinda have to know your db); he looked at me like I'm an idiot and proceeded to migrate the tables over to SQL Server... Needless to say he got a lot of error messages and is now totally confused. Now let me ask some experts who really know databases: do you ever have problems with network techs who think they know it all?
We are planning hardware purchases (more is better). One of our databases is 131 GB in size and has 45 GB of 'space available'. I'm not a very experienced SQL Server person, but this seems like quite a bit of 'space available'.
1) Is there a way to regulate the amount of 'space available'? 2) Are there any rules of thumb for how much 'space available' there should be?
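For what it's worth, 'space available' is simply unused space inside the database's data and log files, so it is governed by the file sizes and autogrow settings rather than by the data itself. A sketch of the usual checks (the logical file name is an assumption), shrinking only if the disk space is genuinely needed:

-- How much of the allocated space is actually used in the current database.
EXEC sp_spaceused;

-- Per-file allocation (size is stored in 8 KB pages).
SELECT name, size/128 AS size_mb FROM sysfiles;

-- Reclaim unused space from one file if necessary; routine shrinking is not recommended.
DBCC SHRINKFILE (MyDatabase_Data, 100000);   -- target size in MB, logical name assumed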
We are about to purchase new database servers and have been offered a good deal on 64-bit Xeon machines. At present we run SQL 2000 on Windows Server 2003, both of which are 32-bit versions.
Is there any problem using our current 32-bit Server software on the 64-bit machines (apart from not being able to utilise its full power)? I'm assuming the SQL 2005 licenses are the same price regardless of 32-bit or 64-bit version. If we buy a 64-bit SQL Server version license are we going to get the best out of it on a 32-bit Windows Server edition?
I have always been told that Cursors create a lot of overhead and consume a lot of system resources. Is it faster to store the data in a temp table and loop through it by using Select Top 1 and Delete statements or by using a static, Forward-Only Cursor? Both ways store the data in TempDB, but doesn't the While Loop statement generate more IO's than the Cursor? In theory, I am thinking that the Cursor is better. Any info will be appreciated.
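For concreteness, these are the two patterns being compared, written against a throwaway temp table so the sketch is self-contained; the names are invented:

-- Assumed work table for illustration.
CREATE TABLE #work (id int PRIMARY KEY);
INSERT INTO #work (id) VALUES (1);
INSERT INTO #work (id) VALUES (2);

DECLARE @id int;

-- Pattern 1: static, forward-only cursor over the temp table.
DECLARE cur CURSOR LOCAL FORWARD_ONLY STATIC FOR
    SELECT id FROM #work;
OPEN cur;
FETCH NEXT FROM cur INTO @id;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- ... process @id here ...
    FETCH NEXT FROM cur INTO @id;
END
CLOSE cur;
DEALLOCATE cur;

-- Pattern 2: WHILE loop that peels rows off the temp table with TOP 1 + DELETE.
WHILE EXISTS (SELECT 1 FROM #work)
BEGIN
    SELECT TOP 1 @id = id FROM #work ORDER BY id;
    -- ... process @id here ...
    DELETE FROM #work WHERE id = @id;
END

DROP TABLE #work;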
I have a table with a field defined as nvarchar and I want to change it to varchar. I have a stored procedure which defines the parameter @strCall_desc as nvarchar(4000). Are there going to be any problems with running this sp if I just change the field type as described?
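For reference, the column change itself is a single ALTER, and the nvarchar(4000) parameter will still work because SQL Server implicitly converts it to varchar on assignment (any characters outside the column's code page would be silently lost). A sketch with invented table and column names:

-- Assumed table/column names for illustration.
ALTER TABLE dbo.Calls
ALTER COLUMN Call_desc varchar(4000) NULL;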
I have a database that is being used as sort of a reports data warehouse. I use DTS packages to upload data from all the different sources. Right now I have it truncating the tables and appending fresh data. I was considering using updates instead, and my question is: which is more efficient?
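If you do move from truncate-and-reload to updates, the usual SQL 2000-era pattern is an UPDATE of matching rows from a staging table followed by an INSERT of the new ones; a hedged sketch with invented table and column names:

-- Sketch: refresh dbo.ReportFacts from a staging table instead of truncating it.
UPDATE r
SET    r.amount    = s.amount,
       r.loaded_at = GETDATE()
FROM   dbo.ReportFacts AS r
JOIN   dbo.Staging_ReportFacts AS s ON s.fact_id = r.fact_id;

INSERT INTO dbo.ReportFacts (fact_id, amount, loaded_at)
SELECT s.fact_id, s.amount, GETDATE()
FROM   dbo.Staging_ReportFacts AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.ReportFacts AS r WHERE r.fact_id = s.fact_id);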