
Friday, March 30, 2012

MultiUsers Question

We had a debate on designing a database. The use case is:

an application that we will rent to other companies from our own datacenter on the web. They will all use the same database schema. We found 3 alternatives for our solution, but we are not sure which one is the best.

We are expecting 20,000 insurance agencies to use the database, which is composed of 33 tables. They will all have their own unique data, but sometimes for reporting we might union their data.

ALTERNATIVES

1. Make 20,000 database instances. Every agency will have its own database instance.

2. Make one database instance, put an ownerID (int) field in every table, and use appropriate indexes. *Or table partitioning.

3. Make one database instance, and create 33 tables for every database owner. So the total number of tables in the database equals numberOfUsers x 33.

Has anyone tested this situation?

*An article I read on the web said partitioning is limited to 1,000 partitions per table and that it is slow in SQL 2005. (I am confused.)

http://www.sqljunkies.com/Article/F4920050-6C63-4109-93FF-C2B7EB0A5835.scuk
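
For concreteness, alternative 2 would look something like this (a minimal sketch; table and column names are illustrative, not from the actual schema):

CREATE TABLE Policies
(
    ownerID INT NOT NULL,       -- the agency (tenant) key
    PolicyID INT NOT NULL,
    PolicyTypeID INT NOT NULL,
    CONSTRAINT PK_Policies PRIMARY KEY (ownerID, PolicyID)
)

-- Per-agency queries lead with ownerID so the composite index is used:
SELECT PolicyID, PolicyTypeID
FROM Policies
WHERE ownerID = @ownerID

-- Cross-agency reporting then needs no UNION; it is just an unfiltered query:
SELECT ownerID, COUNT(*) AS PolicyCount
FROM Policies
GROUP BY ownerID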

In general, I am a fan of putting all the data together in one database. Based on the information that you provided, it is hard to decide whether you need separate databases / instances or not.

There are several things which you should consider:

- Organisational structure (you could also split the database up by region, geography, or country)
- Maintenance (due to different time zones / workloads on the databases, you might have different maintenance windows)
- Load balancing
- Scaling
- Availability
- Estimated workload

etc.

I think that we both agree that you wouldn't put 20,000 databases on one server, or create thousands of instances (the limit is 25 for one computer).

HTH, Jens Suessmeyer.

http://www.sqlserver2005.de


I would choose #1 possibly if it would be important for backup purposes to keep the data "separate". Cross reporting would be difficult, but not totally impossible -- if you can work from a "snapshot". It would be easiest to run a process to pull the data you need from all the agencies into a singular table structure just for the report(s), that has the added OwnerID column. You could do this either as some nightly process, or place a trigger on each of the tables that need to be reported on to replicate the changes to the common table. It would be a completely redundant dataset, so the backup of it isn't all that important, other than the table structure, and any special views/stored procedures/extra tables you may have in there.

This approach also makes it a lot easier to ensure that a mistake isn't made whereby one agency's data is presented to a different agency. If you have a DBA (which I'm guessing you do not, or you are the DBA), then they should be able to provide you with views and stored procedures that the application programmers are allowed to use to access the data, and hence they shouldn't be able to get themselves into trouble.

This type of setup is a bit more complicated to set up, but, it does effectively partition the tables for cross-agency reports.
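
As a sketch of that views/stored-procedures idea (the names here are illustrative; the original post names no objects), the application login would get EXECUTE rights only on tenant-filtered procedures, never on the base tables:

-- The application can only ever see one agency's rows at a time.
CREATE PROCEDURE GetAgencyPolicies
    @AgencyID INT
AS
SELECT PolicyID, PolicyTypeID
FROM Policies
WHERE AgencyID = @AgencyID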

4. Make one database instance, and put an ownerID (int) field in only one (or a few) of the tables. IDs are global (across agencies), so an ownerID is redundant in any table whose primary key is an ID (or GUID) created in a different table.

|||Hmm, I am sorry, but I did not understand how your 4th approach differs from the 2nd approach.|||

First, 20,000 insurance agencies is not many; if you take them to be users, it means you have 20,000 users. In the US, insurance is divided into Property and Casualty (for home and car policies) and Life and Health. So just group them by the type of policy sold or by location, and it is still just 20,000 users. I have told you before: you need a DDL (data definition language) expert.

If you don't have it installed, check the Pubs sample database for ideas if you decide to use location to separate them, because car insurance is sold by state driver's license. Hope this helps.

|||

The 4th approach differs from the 2nd as follows. Say you have these tables:

Users->UserID,Name, etc.
Policies->PolicyID(IDENTITY),OwnerID(a UserID),SalesmanID(a UserID),PolicyTypeID, etc.
PolicyRider->PolicyID,RiderTypeID,etc.

In approach 2, the tables would look like this:

Users->AgencyID,UserID,Name, etc.
Policies->AgencyID,PolicyID,OwnerID,SalesmanID,PolicyTypeID,etc
PolicyRider->AgencyID,PolicyID,RiderTypeID,etc.

Whereas in approach 4, the tables would look like this if you allow any user to belong to multiple agencies:

Users->UserID,Name,etc
UserAgencies->UserID,AgencyID
Policies->AgencyID,PolicyID, ...
PolicyRider->PolicyID,RiderTypeID,etc (NOTE: no AgencyID, because the PolicyRider record has a reference to the Policies table entry, which already contains an AgencyID)

Or like this, if no user can be in multiple agencies:

Users->AgencyID,UserID, ...
Policies->PolicyID, ... (NOTE: no AgencyID; because each policy is owned by a user, and each user can only be in a single agency, the policy must belong to the agency the user is in)
PolicyRider->PolicyID,RiderTypeID,etc (NOTE: no AgencyID, because the PolicyRider record has a reference to the Policies table entry, which has a reference to a Users table entry, which contains an AgencyID)

You may want to attach the AgencyID to the Policy instead of the User in these cases, but the idea is that as long as the "parent" of any relationship already has an AgencyID, the children do not, since you can infer their AgencyID by referencing their (singular) parent's AgencyID.
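
A DDL sketch of that parent/child idea (again with illustrative names):

-- Only the "parent" table carries AgencyID; children reach it via joins.
CREATE TABLE Policies
(
    PolicyID INT IDENTITY PRIMARY KEY,
    AgencyID INT NOT NULL,
    PolicyTypeID INT NOT NULL
)

CREATE TABLE PolicyRider
(
    PolicyID INT NOT NULL REFERENCES Policies(PolicyID),
    RiderTypeID INT NOT NULL
)

-- A rider's agency is inferred from its (singular) parent policy:
SELECT r.PolicyID, r.RiderTypeID, p.AgencyID
FROM PolicyRider AS r
JOIN Policies AS p ON p.PolicyID = r.PolicyID
WHERE p.AgencyID = @AgencyID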

|||

if no Users can be in multiple agencies:

Users->AgencyID,UserID, ...
Policies->PolicyID, ... (NOTE No AgencyID because each policy is owned by a user, and each user can only be in a single agency, this policy must belong to the agency the user is in)

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

I have read it 4 times, but I did not understand how I will know that a record is owned by that AGENCY or by that USER.

Maybe it is too late and I am really tired... is there any userID on the Policies table?


PS: I really do want to implement a simple but performant accountability pattern for the user system. So the users can be in agencies, but I will define that by simple attributeTypes in the database, not using the relations...

|||Sorry, yes. I was assuming that the Policies table would also have an OwnerID field (that's a UserID) or an AgentID (also a UserID), or some other field that you would use to tie the record back to a particular user, which is then tied to an AgencyID.

Multi-user issues with SSRS2000

Hi,

At our company we are looking at implementing SQL Server 2000 Reporting Services into our internet application. Having followed your book, and produced a dozen reports, I have found a problem with calling reports in a multi-user environment.

Some of the reports that our customers want can run to over 100 pages which, whilst a large number, is a requirement. I have found that when two people call one of these large reports at the same time, one of the users gets the error "Login failed for user '(null)'. Reason: Not associated with a trusted SQL Server connection". Once the worker process begins to render the report on the first user's screen, the report may be refreshed on the second user's screen and it renders quite normally. Under normal circumstances the login credentials are valid and identical for both users.

Furthermore, an error 'Server Application Unavailable' is often generated when a number of users are requesting large reports; this appears to be caused by the worker process being recycled.

Our production servers are a mixture of Windows 2000 Server (SP4) and Windows 2003 Server Standard Edition (SP1); the version of SQL Server and Reporting Services is 2000 Enterprise Edition, using the latest service packs (SQL Server is SP4, and Reporting Services is SP2).

Can you suggest how we can get around these errors?

Regards,

Martin

Because your large report is consuming so much memory during the rendering process, your worker process is probably recycling because it's hitting the max MemoryLimit as defined in machine.config.

The "Login Failed" error message that you mentioned is a symptom of your worker process recycling, coming back up, and then trying to request a report on your behalf...however, because it cycled, it lost your user context (credentials), ergo the "Null" user...

The article below discusses dealing with memory limits / issues with large reports.

http://support.microsoft.com/default.aspx?scid=kb;EN-US;909678


Wednesday, March 28, 2012

Multi-user issue

I have a medium sized application running on Windows Mobile 5.0 devices.
This app was developed in VS 2005 and uses the Microsoft application blocks
for data access - running a stored proc. This app has been in use for
several years and works well. I've added a class for the unloading of
containers from our overseas operations. This app works well when one user
is scanning cases, but additional users keep getting kicked out. The
original user never has any trouble. When monitoring the SQL profiler I see
the "Audit Logout" message 7 or 8 times each time a user is kicked out. All
other functionality is working fine, we also have nearly 100 pc's running .
Net apps against this database - all with no problems. This one class/stored
proc is the only one giving trouble. The only real difference with this one
is that I am using the "SaveNoTransaction" method instead of the "Save"
method in the application blocks. I've looked at the code in this method and
see no problem there either. Any suggestions? I'm running out of ideas.
Thanks
Hi
"RickBentley" wrote:

> I have a medium sized application running on Windows Mobile 5.0 devices.
> This app was developed in VS 2005 and uses the Microsoft application blocks
> for data access - running a stored proc. This app has been in use for
> several years and works well. I've added a class for the unloading of
> containers from our overseas operations. This app works well when one user
> is scanning cases, but additional users keep getting kicked out. The
> original user never has any trouble. When monitoring the SQL profiler I see
> the "Audit Logout" message 7 or 8 times each time a user is kicked out. All
> other functionality is working fine, we also have nearly 100 pc's running .
> Net apps against this database - all with no problems. This one class/stored
> proc is the only one giving trouble. The only real difference with this one
> is that I am using the "SaveNoTransaction" method instead of the "Save"
> method in the application blocks. I've looked at the code in this method and
> see no problem there either. Any suggestions? I'm running out of ideas.
> Thanks
>
With the audit logouts I would expect that the connection is deliberately
being closed. Have you got try-catch blocks in the code to trap errors? Have
you also looked at the Errors and Warnings exceptions in Profiler?
John



Multi-user errors

Hi,
At our company we are looking at implementing SQL Server 2000 Reporting
Services into our internet application. Having followed your book, and
produced a dozen reports, I have found a problem with calling reports
in a multi-user environment.
Some of the reports that our customers want can run to over 100 pages
which, whilst a large number, is a requirement. I have found that when
two people call one of these large reports at the same time, one of the
users gets the error "Login failed for user '(null)'. Reason: Not
associated with a trusted SQL Server connection". Once the worker
process begins to render the report on the first user's screen, the
report may be refreshed on the second user's screen and it renders quite
normally. Under normal circumstances the login credentials are valid
and identical for both users.
Furthermore, an error 'Server Application Unavailable' is often
generated when a number of users are requesting large reports; this
appears to be caused by the worker process being recycled.
Our production servers are a mixture of Windows 2000 Server (SP4) and
Windows 2003 Server Standard Edition (SP1), the version of SQL Server
and Reporting Services is 2000 Enterprise Edition, using the latest
service packs (SQL Server is SP4, and Reporting Services is SP2).
Can anyone suggest how we can get around these errors?
Regards,
Martin

Martin,
It sounds like the problems occur during the processing/collecting of
data and that once that finishes and the reports begin rendering the
conflicts resolve. The first thing that comes to mind is rendering
these reports from snapshot data. Using a snapshot, you schedule the
query that generates your large dataset to run during off peak hours.
The reports can then be rendered immediately from the (already)
collected data rather than both collecting and rendering. This of
course assumes that you don't need up-to-the-moment (i.e. real-time) data
for the reports otherwise snapshots may not be an option for you.
Another option might be using subscriptions to distribute (push)
multiple copies of the reports rendered from one run of the process.
This might be the way to go if this is an instance of one report being
called (pulled) by several users. Again, this assumes relatively
static data.
HTH
toolman

Multiuser Databinding in SQL Server 2000?

Hi

I am developing a multi-user application in a Windows environment using VC++ .NET with SQL Server 2000. How can I create a dataset for multi-user data updating in SQL Server 2000? Where can I get help and sample code?

Thanks

Jose

Jose,

The DataSet is an off-line data cache. The DataAdapter is used to submit the pending changes stored in the DataSet. In the DataAdapter's updating logic, you can specify your own concurrency options to control whether you want to use a "last in wins" approach with only the primary key column(s) in the WHERE clause, or use other columns in the WHERE clause to ensure you don't overwrite changes made by another user.
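
For example, the two strategies generate UPDATE statements roughly like these (the table and column names are illustrative, not from the thread):

-- "Last in wins": only the primary key in the WHERE clause.
UPDATE Customers
SET CompanyName = @CompanyName, Phone = @Phone
WHERE CustomerID = @Original_CustomerID

-- Optimistic concurrency: the update succeeds only if the row still
-- holds the values this client originally read.
UPDATE Customers
SET CompanyName = @CompanyName, Phone = @Phone
WHERE CustomerID = @Original_CustomerID
  AND CompanyName = @Original_CompanyName
  AND Phone = @Original_Phone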

I hope this information proves helpful.

David Sceppa
ADO.NET Program Manager
Microsoft

|||Hi

Thank you for the nice explanation. One more question: how can I write a custom primary key value, for example (50710000001)? 5 is for the year, 07 is for the month, 10 is for the place, and the remainder is a unique automatically generated number. How can I write this in SQL Server 2000?

Regards,
Jose
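
This follow-up question went unanswered in the thread. One common sketch is to let an IDENTITY column generate the unique sequence part and derive the display key in a computed column (all table and column names here are illustrative assumptions, not from the thread):

CREATE TABLE Orders
(
    SeqNo INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    PlaceCode CHAR(2) NOT NULL,
    CreatedAt DATETIME NOT NULL DEFAULT GETDATE(),
    -- '5' (year 2005) + '07' (month) + '10' (place) + '000001' (sequence)
    OrderKey AS RIGHT(CAST(YEAR(CreatedAt) AS CHAR(4)), 1)
              + RIGHT('0' + CAST(MONTH(CreatedAt) AS VARCHAR(2)), 2)
              + PlaceCode
              + RIGHT('00000' + CAST(SeqNo AS VARCHAR(6)), 6)
)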

Multithreading?

Hi

Can an application populate two Grids from the database
simultaneously, without using multithreading, in ADO.NET?

Thanks in advance
MR_FBI
mr_fbi2020@.yahoo.com

My guess is that you can't.

From a database perspective, you need two different connections, and that's
not a big deal. ADO .Net can handle that. Then I don't know about data
binding and all that, but since calls in ADO .Net are synchronous, I can't
see how you could fill the grids without multiple threads.

But if you want to know for sure, you should find a forum devoted to
.Net programming.

--
Erland Sommarskog, SQL Server MVP, sommar@.algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

|||Thank you, I think you are right. As soon as I get my VS.NET 2003, I
will try it myself. :)
Best Regards

|||Yes. You can call a stored procedure that returns 2 resultsets. Just fill the
dataset and then bind each table to each datagrid. It's simply a single call
without multi calls/threads.
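
A sketch of such a procedure (illustrative names); in ADO.NET, a single DataAdapter.Fill then produces two tables in the DataSet, one for each grid:

CREATE PROCEDURE dbo.GetBothGrids
AS
SET NOCOUNT ON
-- first resultset -> DataSet.Tables[0], bound to grid 1
SELECT CustomerID, CompanyName FROM dbo.Customers
-- second resultset -> DataSet.Tables[1], bound to grid 2
SELECT OrderID, OrderDate FROM dbo.Orders
GO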

--
-oj
http://www.rac4sql.net

"Mr_FBI" <mr_fbi2020@.yahoo.com> wrote in message
news:861074e2.0402231132.73707b3b@.posting.google.c om...
> Hi
> Can an application populate two Grids from the database,
> simultaneously without using the multithreading using ADO.NET
> Thanks in advance
> MR_FBI
> mr_fbi2020@.yahoo.com

Multithreading SMO

I have been experimenting with multithreading the SMO database objects to increase performance in my application, but I never seem to be able to push the CPU load of the system above 25% (4-processor server).

Has anyone successfully been multithreading these objects?

Niklas,

SMO can be used in a multithreaded scenario, but improvements in performance highly depend on the design of your application. SMO applications are usually not CPU-bound, and the best strategy is usually to optimize the number of queries to SQL Server that SMO has to perform.

Here are a couple of great articles written by one of the SMO architects, Michiel Wories:

http://blogs.msdn.com/mwories/archive/2005/05/02/smoperf1.aspx

http://blogs.msdn.com/mwories/archive/2005/05/02/smoperf2.aspx


Multithreaded run of SSIS packages

Is it safe to run several SSIS packages in parallel in a multithreaded application? The package objects are not accessed from multiple threads: every thread is handling its own package (see below).

void task_inside_thread()
{
System::String ^s = "...";
Microsoft::SqlServer::Dts::Runtime::Package p;
p.LoadFromXML(s, nullptr);

// callback handler
MyEventsClass ^ eventsClass = gcnew MyEventsClass();

DTSExecResult result = p.Execute(nullptr, nullptr, eventsClass, nullptr, nullptr);
return;

}

Is there any relevant documentation with respect to this issue?

Thanks,

Bogdan

Yes, this should be safe, packages are independent of each other.|||

I was worried more about the fact that the package instances might use global objects for registration and for signalling their current status.

Thanks for your answer.

Bogdan

|||Yes, they sometimes do. Let me re-phrase: they behave as independent objects, taking care of locking global state when needed.

Wednesday, March 21, 2012

Multiple versions of SQL Server 2005 - What order to install?

I currently have three SQL Server instances installed on my notebook. One is v8, which I believe is used by an accounting application. The other two are as follows:-

SQL Server MSSMLBIZ

SQL Server SQLEXPRESS

The second one (SQLEXPRESS) fails to start generating the following error:-

The SQL Server (SQLEXPRESS) service terminated with service-specific error 17058 (0x42A2).

Other services listed in SQL Server Configuration Manager are:-

SQL Server Integration Services - which runs apparently normally.

SQL Server Analysis Services (MSSQLSERVER)

SQL Server Reporting Services (MSSQLSERVER)

SQL Server Browser

The last three of these run under LocalSystem.

I have been trying to install SQL Server SP2 on the SQLEXPRESS instance for some days now, but each time it runs, it fails to install.

I am beginning to wonder if there is a sequence of installation for each of the above instances that should be followed. Can anyone give me any suggestions, or answers to the above problems?

Thanks,

It may be that the SQL Server Express installation is not using a 'good' login account.

Go to [Control Panel], [Admin Tools], [Services]. Find the SQLExpress instance, and double-click, then select the [Log On] tab. Select 'Local System Account', and start the service.

If it does not start, you need to uninstall and then re-install SQL Express.

If it does start up, refer to these articles about properly setting up a service account.

Configuration -Service Accounts, SQL Server 2005 - Setting Up Windows Service Accounts
http://msdn2.microsoft.com/en-us/library/ms143691.aspx
http://msdn2.microsoft.com/en-us/library/ms143504.aspx

|||

You might try changing the problem service to start by using the local system...

Monday, March 19, 2012

Multiple users problem

Hi, I am building a web application that will be used by multiple users. Users import data from a CSV into a 'temp' table; the data is later converted to the right data types and then moved to another table. I was wondering what would happen if more than one user was importing different files at the same time.

Will this create a problem, if so how can I fix this?

Thanks for the help.

From the perspective of the SQL database, it won't care if multiple people are uploading at the same time. You might however.

The simple solution (especially if this is a multi-step process) is to ensure that in your shared temp table you insert the UserID into each row. Then as you step through your process you just make sure that the same UserID is in each row; that way users cannot trip over / corrupt each other's data at import time.

This assumes that the CSV file names are different. If you require all people to name their file 'Import.csv' then of course you will either have a file-lock issue (you can't upload the second CSV if the first is in use) or an over-writing issue (your code doesn't check whether the file is in use and just over-writes it).

I hope that helps.

|||

Or simply use session-scoped temporary tables like:

CREATE TABLE #Table ...

Of course it'll disappear as soon as you close the connection, and won't interfere with any other connection's #Table. (The real name is something god-awful like #Table_________________________________AB9F3E2578A2C, where the last part is random-ish for each different connection, but you call it #Table, and it knows which one you mean.)
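
A minimal sketch of that approach (the column names are illustrative):

-- Each connection gets its own private copy of #ImportTemp.
CREATE TABLE #ImportTemp
(
    RawField1 VARCHAR(255),
    RawField2 VARCHAR(255)
)

-- Load this user's CSV rows into #ImportTemp, convert types, then:
INSERT INTO dbo.FinalTable (Field1, Field2)
SELECT CAST(RawField1 AS INT), CAST(RawField2 AS DATETIME)
FROM #ImportTemp

DROP TABLE #ImportTemp  -- also dropped automatically when the connection closes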

|||Hey guys,
I realize it's been quite some time, but thanks to both of you for your answers. I have used the 2nd method because I don't have to change my table definition. But thank you for your ideas.

Multiple users logging into a database

I'm developing an application in ASP.Net that uses SQL Server. The database person on the team wants each user to have a separate login to the database so he can tell who has made what changes in the database. Is this a good practice? Or are there performance/security issues with this model?

For system performance it is better to use one account (one connection string) and connection pooling. My solution is as follows. Create your own users and log-in system. You can save the current username or ID to Session. Then create a table system_events in your database. Every time a user does something, you insert a new row that describes the user action. Also, and it is maybe even better, you can use plain-text logs for this.

|||We implemented this feature: we wrote a stored procedure to check the old value against the new values being updated in the database, and it then writes only the changes to a separate table designed for tracking changes. We then wrote a page to display change history by date range in a D-Grid. It works very well, and there has been no noticeable performance hit since implementation.

CREATE PROCEDURE AddChangeHistory
(
@propertyID INT,
@userName VARCHAR(75), --Assigned in session when a user logs in
@tableName VARCHAR(75),
@fieldName VARCHAR(80),
@fieldChangeVal VARCHAR(200)
)

AS

DECLARE @fieldInitVal VARCHAR(200)
DECLARE @sql VARCHAR(500)

BEGIN TRAN
CREATE TABLE #tempVal
(
fieldInitVal VARCHAR(200)
)

--Fetch the field's current value via dynamic SQL; the INT parameter
--must be converted before it can be concatenated into the string
SET @sql = 'SELECT TOP 1 ' + @fieldName + ' FROM ' + @tableName +
    ' WHERE propertyID=' + CONVERT(VARCHAR(10), @propertyID)
INSERT INTO #tempVal EXEC(@sql)

SELECT @fieldInitVal = (SELECT TOP 1 fieldInitVal FROM #tempVal)

DROP TABLE #tempVal

--Log a row only when the value actually changed
IF (LOWER(@fieldInitVal) <> LOWER(@fieldChangeVal))
BEGIN
INSERT INTO ChangeHistory(propertyID, userName, tableName, fieldName, fieldInitVal, fieldChangeVal)
VALUES (@propertyID, @userName, @tableName, @fieldName, @fieldInitVal, @fieldChangeVal)
END

IF (@@ERROR <> 0)
BEGIN
ROLLBACK TRAN
END
ELSE
BEGIN
COMMIT TRAN
END
GO


SPROC to update my database
CREATE PROCEDURE dbo.oPK_UpdateAdminHours
(
@Role AS varchar(100),
@UserName AS varchar(100),
@vcManager AS varchar(50),
@vcAssistant1 AS varchar(50),
@vcAssistant2 AS varchar(50),
@vcLeasingManager AS varchar(50),
@vcMarketingDirector AS varchar(50),
@vcLeasing1 AS varchar(50),
@vcLeasing2 AS varchar(50),
@vcLeasing3 AS varchar(50),
@vcLeasing4 AS varchar(50),
@vcLeasing5 AS varchar(50),
@bMon_Fri AS bit,
@cDisplaySummer_Winter AS char(1),
@vcOfficeHoursSummerMon AS varchar(25),
@vcOfficeHoursWinterMon AS varchar(25),
@vcOfficeHoursSummerTue AS varchar(25),
@vcOfficeHoursWinterTue AS varchar(25),
@vcOfficeHoursSummerWed AS varchar(25),
@vcOfficeHoursWinterWed AS varchar(25),
@vcOfficeHoursSummerThur AS varchar(25),
@vcOfficeHoursWinterThur AS varchar(25),
@vcOfficeHoursSummerFri AS varchar(25),
@vcOfficeHoursWinterFri AS varchar(25),
@vcOfficeHoursSummerSat AS varchar(25),
@vcOfficeHoursWinterSat AS varchar(25),
@vcOfficeHoursSummerSun AS varchar(25),
@vcOfficeHoursWinterSun AS varchar(25),
@vcTimeZone AS varchar(50),
@PropertyID AS INT
)
AS
--Calling the AddChangeHistory SPROC to check the values before updating the database.
--NOTE: the original post omitted the @tableName argument from these calls;
--'Properties' below is a placeholder for the real audited table name.
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcManager', @vcManager
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcAssistant1', @vcAssistant1
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcAssistant2', @vcAssistant2
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcLeasingManager', @vcLeasingManager
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcMarketingDirector', @vcMarketingDirector
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcLeasing1', @vcLeasing1
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcLeasing2', @vcLeasing2
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcLeasing3', @vcLeasing3
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcLeasing4', @vcLeasing4
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcLeasing5', @vcLeasing5
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'bMon_Fri', @bMon_Fri
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'cDisplaySummer_Winter', @cDisplaySummer_Winter
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursSummerMon', @vcOfficeHoursSummerMon
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursWinterMon', @vcOfficeHoursWinterMon
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursSummerTue', @vcOfficeHoursSummerTue
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursWinterTue', @vcOfficeHoursWinterTue
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursSummerWed', @vcOfficeHoursSummerWed
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursWinterWed', @vcOfficeHoursWinterWed
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursSummerThur', @vcOfficeHoursSummerThur
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursWinterThur', @vcOfficeHoursWinterThur
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursSummerFri', @vcOfficeHoursSummerFri
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursWinterFri', @vcOfficeHoursWinterFri
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursSummerSat', @vcOfficeHoursSummerSat
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursWinterSat', @vcOfficeHoursWinterSat
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursSummerSun', @vcOfficeHoursSummerSun
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcOfficeHoursWinterSun', @vcOfficeHoursWinterSun
exec AddChangeHistory @PropertyID, @UserName, 'Properties', 'vcTimeZone', @vcTimeZone

--Put your Update code here (removed for space)

Hope this helps.|||Thank you so much. That is what I needed.

Multiple Updates at once?

I have an application that, upon exiting, has to update upwards of 300+
records in a SQL Server database.
There are issues with timeouts due to network traffic and server load.
I was wondering if there was a better procedure or method of transferring a
large group of simple SQL tasks to the server that could perhaps be done in
one complete action. This may alleviate the timeout issue.
Thank you for your help.

I take it you aren't making a single call to a stored proc? Or the 300
records can't be updated with a single UPDATE statement?
Bryce

|||300 rows isn't that many to update - provided you don't do it as 300
separate UPDATE statements. Are you using stored procedures to perform the
updates? Where does the data come from?
David Portas
SQL Server MVP
--

|||I am not familiar with stored procedures. How would I make a stored
procedure that would change multiple records in a database based on
different criteria, ie replace UPDATE queries like these ones.
UPDATE Movies SET MovieClassNo = 1 WHERE (MovieID = 'A1241234')
UPDATE Movies SET MovieClassNo = 2 WHERE (MovieID = 'F19892346')
UPDATE Movies SET MovieClassNo = 3 WHERE (MovieID = 'L34982423')
UPDATE Movies SET MovieClassNo = 1 WHERE (MovieID = 'Q98645949')
I have looked in the BOL, and I just get kinda lost when I do.
"David Portas" <REMOVE_BEFORE_REPLYING_dportas@.acm.org> wrote in message
news:IY6dnSoM4vCqM63fRVn-oA@.giganews.com...
> 300 rows isn't that many to update - provided you don't do it as 300
> separate UPDATE statements. Are you using stored procedures to perform the
> updates? Where does the data come from?
> --
> David Portas
> SQL Server MVP
> --
>|||Atley wrote:
> I am not familiar with stored procedures. How would I make a stored
> procedure that would change multiple records in a database based on
> different criteria, ie replace UPDATE queries like these ones.
> UPDATE Movies SET MovieClassNo = 1 WHERE (MovieID = 'A1241234')
> UPDATE Movies SET MovieClassNo = 2 WHERE (MovieID = 'F19892346')
> UPDATE Movies SET MovieClassNo = 3 WHERE (MovieID = 'L34982423')
> UPDATE Movies SET MovieClassNo = 1 WHERE (MovieID = 'Q98645949')
>
Are those movieID values hard-coded or are they parameters that need to
be passed into the procedure? What about the MovieClassNo? If they are
parameters and you have 300 updates to run, then I'd want to know what
this process is and why you need to update 300 rows in the Movies table
when the application terminates. Passing 300 (or possibly 600)
parameters to a procedure is not ideal. In that case, it would probably
be easier to call the procedure 300 times, once for each movieID.
But I'd still like to understand better the reason for the updates and
why there are so many at application termination.
David Gugick
Imceda Software
www.imceda.com

|||In principle you can do something like this:
UPDATE Movies
SET movieclassno =
CASE movieid
WHEN 'A1241234' THEN 1
WHEN 'F19892346' THEN 2
WHEN 'L34982423' THEN 3
WHEN 'Q98645949' THEN 1
END
WHERE movieid IN
('A1241234',
'F19892346',
'L34982423',
'Q98645949')
but it's better to use stored procedure parameters to pass those values
rather than construct the SQL dynamically at the client. The question
is, why are you batching up all 300 updates to execute them in one go?
David Portas
SQL Server MVP
--|||These are Movies that are being transfered from one location to another in a
bulk transfer
So the movies are being scanned in, and then the location ID has to be
changed in one fell swoop.
The parameters are not known until the waybill is made and the movies are
scanned in.
"David Portas" <REMOVE_BEFORE_REPLYING_dportas@.acm.org> wrote in message
news:1110536761.241514.182540@.g14g2000cwa.googlegroups.com...
> In principle you can do something like this:
> UPDATE Movies
> SET movieclassno =
> CASE movieid
> WHEN 'A1241234' THEN 1
> WHEN 'F19892346' THEN 2
> WHEN 'L34982423' THEN 3
> WHEN 'Q98645949' THEN 1
> END
> WHERE movieid IN
> ('A1241234',
> 'F19892346',
> 'L34982423',
> 'Q98645949')
> but it's better to use stored procedure parameters to pass those values
> rather than construct the SQL dynamically at the client. The question
> is, why are you batching up all 300 updates to execute them in one go?
> --
> David Portas
> SQL Server MVP
> --
>|||These are Movies that are being transfered from one location to another in a
bulk transfer
So the movies are being scanned in, and then the location ID has to be
changed in one fell swoop.
The parameters are not known until the waybill is made and the movies are
scanned in.
This has to be done before the application is closed or the waybill is
printed, or the waybill is closed.
"David Gugick" <davidg-nospam@.imceda.com> wrote in message
news:uAWkxjdJFHA.3960@.TK2MSFTNGP09.phx.gbl...
> Atley wrote:
> Are those movieID values hard-coded or are they parameters that need to
> be passed into the procedure? What abou the MoveiClassNo? If they are
> parameters and you have 300 updates to run, then I'd want to know what
> this process is and why you need to update 300 rows in the Movies table
> when the application terminates. Passing 300 (or possible 600)
> parameters to a procedure is not ideal. In that case, it would probably
> be easier to call the procedure 300 times, once for each movieID.
> But I'd still like to understand better the reason for the updates and
> why there are so many at application termination.
>
> --
> David Gugick
> Imceda Software
> www.imceda.com
>|||Atley wrote:
> These are Movies that are being transfered from one location to
> another in a bulk transfer
> So the movies are being scanned in, and then the location ID has to be
> changed in one fell sweep.
> The parameters are not known until the waybill is made and the movies
> are scanned in.
> This has to be done before the application is closed or the waybill is
> printed, or the waybill is closed.
>
> "David Gugick" <davidg-nospam@.imceda.com> wrote in message
> news:uAWkxjdJFHA.3960@.TK2MSFTNGP09.phx.gbl...
I would try writing a stored procedure to do the update and calling it
300 times from the application. It should happen very quickly. Pass the
MovieClassNo and MovieID to the procedure. This option keeps all the
intelligence in the database.
Before I would consider anything else, I would try that.
David Gugick
Imceda Software
www.imceda.com

|||CREATE TABLE WayBills
(movie_id CHAR(8) NOT NULL PRIMARY KEY,
 movie_class_nbr INTEGER NOT NULL
  CHECK (movie_class_nbr BETWEEN 1 AND 4));

Then use the collected data:

UPDATE Movies
   SET movie_class_nbr
       = (SELECT movie_class_nbr
            FROM WayBills AS W1
           WHERE W1.movie_id = Movies.movie_id)
 WHERE EXISTS (SELECT *
                 FROM WayBills AS W1
                WHERE W1.movie_id = Movies.movie_id);

Multiple types of data storing in SQL Server

Hello All,

We are developing a customer support application. We will have many customers after launching this product, but my problem is how I will store the data of all these customers in SQL Server. Please advise.

You have not provided enough information about the nature of the problem for us to help you find a solution.

Multiple threads or not?

SQL 7.0/NT 4.0 latest SPs
Performance question.
One application will be writing to two tables. 1000s of
records at a time, in bursts, and at the same time, the
tables will be serving up data to end-users from a
separate application.
Question: Should there be a separate thread written to
each table that the application is writing to? Would that
help performance getting the records in faster and reduce
the amount of over-head so that the users making their
requests aren't slow in getting their data.
Thank you for your help.
Don

How are you inserting them? Have you tried using Bulk Insert?
--
Andrew J. Kelly
SQL Server MVP
"Don" <ddachner@.yahoo.com> wrote in message
news:04b701c37ed2$a0adc300$a101280a@.phx.gbl...
> SQL 7.0/NT 4.0 latest SPs
> Performance question.
> One application will be writing to two tables. 1000s of
> records at a time, in bursts, and at the same time, the
> tables will be serving up data to end-users from a
> separate application.
> Question: Should there be a separate thread written to
> each table that the appliation is writing to? Would that
> help performance getting the records in faster and reduce
> the amount of over-head so that the users making their
> requests aren't slow in getting their data.
> Thank you for your help.
> Don
|||The developer is using INSERT. Looks like one record at
a time from C++.
Bulk insert would be better?
Don
|||A Bulk Insert would generally be many times faster than individual inserts.
--
Andrew J. Kelly
SQL Server MVP
"Don" <ddachner@.yahoo.com> wrote in message
news:055401c37ee9$c88931e0$a401280a@.phx.gbl...
> The developer is using INSERT. Looks like one record at
> at time from C++.
> Bulk insert would be better?
> Don
> >--Original Message--
> >How are you inserting them? Have you tried using Bulk
> Insert?
> >
> >--
> >
> >Andrew J. Kelly
> >SQL Server MVP
> >
> >
> >"Don" <ddachner@.yahoo.com> wrote in message
> >news:04b701c37ed2$a0adc300$a101280a@.phx.gbl...
> >> SQL 7.0/NT 4.0 latest SPs
> >>
> >> Performance question.
> >>
> >> One application will be writing to two tables. 1000s of
> >> records at a time, in bursts, and at the same time, the
> >> tables will be serving up data to end-users from a
> >> separate application.
> >>
> >> Question: Should there be a separate thread written to
> >> each table that the appliation is writing to? Would
> that
> >> help performance getting the records in faster and
> reduce
> >> the amount of over-head so that the users making their
> >> requests aren't slow in getting their data.
> >>
> >> Thank you for your help.
> >>
> >> Don
> >>
> >
> >
> >.
|||Bulk insert techniques are the fastest way to load large volumes of data
into SQL Server. You can use SQLOLEDB IRowsetFastLoad or ODBC Bulk Copy
to load data directly from a C++ program. To load data from files, you
can use T-SQL BULK INSERT, the BCP command-line utility or DTS.
Parallel loading (e.g. multiple threads) can improve throughput further.
However, you may not need to resort to that since you can probably load
thousands of rows in a few seconds.
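
A minimal T-SQL example of the file-based route (the table and file names are illustrative):

-- Loads a comma-delimited file in one bulk operation; TABLOCK helps
-- the load qualify for minimal logging.
BULK INSERT dbo.TargetTable
FROM 'c:\data\batch.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK)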
--
Hope this helps.
Dan Guzman
SQL Server MVP
--
SQL FAQ links (courtesy Neil Pike):
http://www.ntfaq.com/Articles/Index.cfm?DepartmentID=800
http://www.sqlserverfaq.com
http://www.mssqlserver.com/faq
--
"Don" <ddachner@.yahoo.com> wrote in message
news:055401c37ee9$c88931e0$a401280a@.phx.gbl...
> The developer is using INSERT. Looks like one record at
> at time from C++.
> Bulk insert would be better?
> Don
> >--Original Message--
> >How are you inserting them? Have you tried using Bulk
> Insert?
> >
> >--
> >
> >Andrew J. Kelly
> >SQL Server MVP
> >
> >
> >"Don" <ddachner@.yahoo.com> wrote in message
> >news:04b701c37ed2$a0adc300$a101280a@.phx.gbl...
> >> SQL 7.0/NT 4.0 latest SPs
> >>
> >> Performance question.
> >>
> >> One application will be writing to two tables. 1000s of
> >> records at a time, in bursts, and at the same time, the
> >> tables will be serving up data to end-users from a
> >> separate application.
> >>
> >> Question: Should there be a separate thread written to
> >> each table that the appliation is writing to? Would
> that
> >> help performance getting the records in faster and
> reduce
> >> the amount of over-head so that the users making their
> >> requests aren't slow in getting their data.
> >>
> >> Thank you for your help.
> >>
> >> Don
> >>
> >
> >
> >.
> >

Monday, March 12, 2012

multiple tempdb references in master sysaltfiles

Hi,
We tried to create a new database on an application server (Win 2003
server/SQL Server 2003) and got the following error.
error 945...
....
device activation error. The physical filename g:\mssqldata
\templog.ldf may be incorrect.
The problem is that tempdb is on f:\mssql as shown in the database
properties and with sp_dbhelp.
Poking around in master, the real weirdness comes through. sysaltfiles
has 2 entries each for tempdb logs and data files. One of them is on
g: and one is on f:. The lower dbid is on f: and the higher one is on
g: (actually the last two rows in the table).
I hope that makes sense: there are 4 rows for tempdb. One pair points
the log and data files to the g: drive, and one pair of rows points
them to the f: drive. The references to the g: drive need to go away, but
I've been googling for a while now and haven't come up with much.
I tried running the alter database command normally used to move
tempdb and though it didn't fail, it didn't change anything.
Several months ago our software vendor moved tempdb from g: to f: to
try and speed it up a bit. Apparently they messed it up and have now
written us off till WE fix it.
The entries in sysaltfiles were the only references to g: that turned
up (though we didn't look at every table and aren't even remotely sure
where other references might be located).
Any pointers on getting this corrected would be greatly appreciated.
We thought about trying a reconfigure and restarting but I'm not real
hopeful. We also thought about just updating the wrong entries to
reflect the right locations but that smacks of kluge.
Tangential weirdness is that while trying to isolate the source of g:
\mssql in the error I found that in master.sysdevices the file
location is e:\Program Files\Microsoft SQL Server\MSSQL\data
\tempdb.mdf.
I believe this is from the initial install then while configuring the
server it got moved to g: then to f:.
HELP!!!
Thanks in advance for any input!
Rusty
I've been able to verify that the sysdatabases reference to tempdb points
at dbid 2, which in sysaltfiles is the dbid of the two rows that point
at the CORRECT file locations....
I'm starting to think it might be okay to just remove the incorrect
rows from sysaltfiles and restart.
But the thought gives me the screaming heebie-jeebies!
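
A quick way to see the duplicate rows being described (dbid 2 is tempdb):

SELECT dbid, fileid, name, filename
FROM master..sysaltfiles
WHERE dbid = 2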
|||Hi
"rnbwil@.gmail.com" wrote:

> Hi,
> We tried to create a new database on an application server (Win 2003
> server/SQL Server 2003) and got the following error.
>
> error 945...
> ....
> device activation error. The physical filename g:\mssqldata
> \templog.ldf may be incorrect.
> The problem is that tempdb is on f:\mssql as shown in the database
> properties and with sp_dbhelp.
> Poking around in master the real wierdness comes through. sysaltfiles
> has 2 entries each for tempdb logs and data files. One of them is on
> g: and one is on f:. The lower dbid is on f: and the higher one is on
> g: (actually the last two rows in the table).
> I hope that makes sense: there're 4 rows for tempdb. one pair points
> the log and data files to the g: drive and one pair of rows points
> them to the f: drive. The references to g: drive need to go away but
> I've been googinling for a while now and haven't come up with much.
> I tried running the alter database command normally used to move
> tempdb and though it didn't fail, it didn't change anything.
> Several months ago our software vendor moved tempdb from g: to f: to
> try and speed it up a bit. Appearantly they messed it up and now have
> written us off till WE fix it.
> The entries in sysaltfiles were the only references to g: that turned
> up (though we didn't look at every table and aren't even remotely sure
> where other references might be located).
> Any pointers on getting this corrected would be greatly apprecieated.
> We thought about trying a reconfigure and restarting but I'm not real
> hopeful. We also thought about just updating the wrong entries to
> reflect the right locations but that smacks of kluge.
> Tangential wierdness is that while trying to isolate the source of g:
> \mssql in the error I found that in master.sysdevices the file
> location is e:\Program Files\Microsoft SQL Server\MSSQL\data
> \tempdb.mdf.
> I believe this is from the initial install then while configuring the
> server it got moved to g: then to f:.
> HELP!!!
> Thanks in advance for any input!
> Rusty
>
When creating the database I would not expect tempdb to have anything to do
with it!
What command have you used to create this database? It seems to me that you
have specified g:\mssqldata\templog.ldf as the log file instead of
g:\mssql\data\NewDatabase_log.ldf? Check that the default database locations
have been correctly set.
What happens if you just use the T-SQL
CREATE DATABASE NewDatabase
John
John - thanks for the input. I wouldn't think it would either but...
This is the output from 'create database test123' in QA.
Server: Msg 945, Level 14, State 2, Line 1
Database 'test123' cannot be opened due to inaccessible files or
insufficient memory or disk space. See the SQL Server errorlog for
details.
The CREATE DATABASE process is allocating 0.63 MB on disk 'test123'.
The CREATE DATABASE process is allocating 0.49 MB on disk
'test123_log'.
Device activation error. The physical file name 'g:\Sqldata\templog.ldf' may be incorrect.
|||Hi
"rnbwil@.gmail.com" wrote:

> On May 31, 11:52 pm, John Bell <jbellnewspo...@.hotmail.com> wrote:
> John - thanks for the input. I wouldn't think it would either but...
>
> This is the output from 'create database test123' in QA.
> Server: Msg 945, Level 14, State 2, Line 1
> Database 'test123' cannot be opened due to inaccessible files or
> insufficient memory or disk space. See the SQL Server errorlog for
> details.
> The CREATE DATABASE process is allocating 0.63 MB on disk 'test123'.
> The CREATE DATABASE process is allocating 0.49 MB on disk
> 'test123_log'.
> Device activation error. The physical file name 'g:\Sqldata
> \templog.ldf' may be incorrect.
>
That is a different directory on the G Drive to the one you initially posted!
CREATE DATABASE test123
ON ( NAME = Test123_dat,
FILENAME = 'F:\mssql\data\test123.mdf',
SIZE = 10,
FILEGROWTH = 5 )
LOG ON
( NAME = 'Test123_log',
FILENAME = 'F:\mssql\data\test123.ldf',
SIZE = 5MB,
FILEGROWTH = 5MB )
GO
See what sp_helpfile returns when you are in tempdb and model. You may then
want to try moving tempdb using the ALTER DATABASE command as described in
http://support.microsoft.com/kb/224071/
John
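
For reference, the move described in that KB article boils down to this (assuming the default logical file names tempdev and templog; the target paths are illustrative), followed by a restart of the SQL Server service:

USE master
GO
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'f:\mssql\data\tempdb.mdf')
GO
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'f:\mssql\data\templog.ldf')
GO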
Tried the above and received the same error.
I'm not sure moving tempdb will help.
I ran the ALTER DATABASE SQL (per the MS KB on moving tempdb),
specifying the current location, and it didn't error out but didn't do
anything to the extraneous entries.
