Red Gate forums: SQL Backup 7

RE: Tran Log backups on FusionIO cards are 6 times larger

The backups are now 6 times larger, but the database activity has not changed.

UNC path and backups over the network

Hi,

I am having a very interesting debate with one of my IT resources. Due to the backup storage solution he previously bought, I've been asked to drop backup files onto a share over the network.

Now, having offsite backups (and rotating or keeping those on removable media) is perfectly fine, but the backup pipe does not look big enough to me. I am actually having some issues due to SAN and network bandwidth. So, long story short, I am trying to redesign the existing backup jobs so I can use that share.

I've been doing some testing, and it looks like RedGate 7.0 "does not like" it when I map the share as a drive, let's say the "Z" drive, and specify that in the backup job. Instead, if I use the UNC path, like \\server\whatever folder, the backup job seems to work without an issue.

Here are my two questions, or actually two questions and one request for technical advice:

1. Can I specify in RedGate, at step #3 of scheduling backups, either a UNC path or a regular Windows drive mapping in the Folder field? Or is a UNC path mandatory?

2. If I can specify either UNC or regular drive letter paths for the folder, what could be the problem with the Windows mapping, as the permissions look to be OK?

The technical advice or question would be: should I use the "copy backup to network" option or schedule a totally separate job for the offsite backups? I am trying to reduce network and IO stress on the SAN cluster and the network.

And should I stick to the UNC path or create a Windows mapping? I know RedGate is cluster aware, and assuming the UNC path does not change, it should be more stable than a Windows mapping with a given drive letter. Am I right about that?

RE: UNC path and backups over the network

Hi sql-lover,

Thank you for your post in the forum.

SQL Backup does not support mapped drives, so you will need to specify a UNC path to the network share.

As you may know, the SQL Backup processes are controlled by the SQL Backup Agent service, a Windows service which performs the backup and restore operations requested through the graphical user interface or the extended stored procedure.

When a Windows account runs as a service, it does not behave in the same manner as a normally logged-in user account with regard to mapped drives: the service does not make use of them. That is why SQL Backup does not support mapped drives.

So to answer your questions:
Quote:

1. Can I specify in RedGate, at step #3 of scheduling backups, either a UNC path or a regular Windows drive mapping in the Folder field? Or is a UNC path mandatory?

2. If I can specify either UNC or regular drive letter paths for the folder, what could be the problem with the Windows mapping, as the permissions look to be OK?


You need to set a UNC path to the share, and the account that the SQL Backup Agent service runs under will require security permissions on the share.
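
For illustration, a minimal sketch of a backup written straight to a share via the extended stored procedure; the share name \\fileserver\sqlbackups and the database name are placeholders, and the SQL Backup Agent service account is assumed to have write access to that share:

Code:
EXEC master..sqlbackup '-sql "BACKUP DATABASE [AdventureWorks] TO DISK = [\\fileserver\sqlbackups\<DATABASE>_<DATETIME yyyymmdd hhnnss>.sqb]"'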

Quote:

The technical advice or question would be: should I use the "copy backup to network" option or schedule a totally separate job for the offsite backups? I am trying to reduce network and IO stress on the SAN cluster and the network.


As you have a requirement to reduce network and IO stress, my recommendation is to select the "copy backup to network" option, provided you have sufficient free disk space to create the local copy. Although SQL Backup is capable of backing up directly to a network share, doing so adds overhead to the process and is not recommended practice. However, many SQL Backup users do back up directly across their networks to a share. I recommend that you evaluate both options in a test environment before applying them to your production machines.
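
As a rough sketch of that recommended approach (the paths, share and database name are placeholders), back up to a local folder and let SQL Backup copy the finished file to the share using the COPYTO option:

Code:
EXEC master..sqlbackup '-sql "BACKUP DATABASE [AdventureWorks] TO DISK = [g:\backups\<DATABASE>_<DATETIME yyyymmdd hhnnss>.sqb] WITH COPYTO = [\\fileserver\sqlbackups\<DATABASE>\]"'

The backup itself is written locally first and the resulting .sqb file is then copied to the share, which keeps the backup operation itself off the network.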

There is an unsupported stand-alone copy tool that can be made available to you. If you wish to evaluate the stand-alone copy tool, please e-mail the Red Gate Support team, support@red-gate.com and request the stand-alone SQL Backup Copy tool.

Many Thanks
Eddie

Capturing the SQL Backup message output

In addition to capturing the @exitcode OUTPUT and @sqlerrorcode OUTPUT from the command line, I wish to also capture the SQL Backup messages that are returned in the first dataset. Can anyone tell me how to do this?

RE: Capturing the SQL Backup message output

Use the SINGLERESULTSET option e.g.

Code:
CREATE TABLE #sqb (DATA NVARCHAR(4000))

INSERT INTO #sqb EXECUTE master..sqlbackup '-sql "BACKUP DATABASE pubs TO DISK = [<AUTO>] WITH SINGLERESULTSET"'
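
As a self-contained sketch building on that, the same command can also capture the exit code and SQL error code mentioned in the question (the BACKUP command is just the example from above):

Code:
DECLARE @exitcode INT, @sqlerrorcode INT
CREATE TABLE #sqb (DATA NVARCHAR(4000))

INSERT INTO #sqb
EXECUTE master..sqlbackup '-sql "BACKUP DATABASE pubs TO DISK = [<AUTO>] WITH SINGLERESULTSET"',
    @exitcode OUTPUT, @sqlerrorcode OUTPUT

SELECT DATA FROM #sqb                                    -- the SQL Backup message output
SELECT @exitcode AS exitcode, @sqlerrorcode AS sqlerrorcode
DROP TABLE #sqb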

Use of file location tags in "Copy backup to network location"

Can I use the file location tags in the "Copy Backup To Network Location" option? Originally, I wanted to organize my backups by year-month-day. I have a finite amount of local storage, and organizing them this way will make them easier to migrate to long-term storage.

However, in order for SQL Backup to keep only a few backups locally, and to adhere to the recommendation to perform the backups locally, I'll need to organize them by database.

I hope this is clear. If there is a way to achieve this other than the way I'm approaching it, I'm open to ideas!

Thanks

Is the copying of backups to a network location separate?

When the "Copy Backups To Network Location" option is used, Is it run on a thread separate from the actual backup process?

It seems to me that it would be preferred that copying the backups would happen independent of the backup process.

Outcome column on Jobs tab is misleading

I'm confused. A few minutes after I manually start a backup job, the Outcome column reports it as successful, but if I look in the In Progress tab the backup job is still running... isn't that misleading?

RE: Outcome column on Jobs tab is misleading

Are you verifying the backup?

Chris

Object Level Recovery error

I have just been trialing SQL Backup Pro (after being advised HyperBac was being discontinued) and came across an error while trying to restore a table via Object Level Recovery:

Code:

Unhandled Exception
System.IndexOutOfRangeException occurred:
Index was outside the bounds of the array.
at kk.a(Int32)
at kl.a(jV,lc)
at ku.GetDataStream()
at ku.get_Data()
at ks.b()
at kP.a(String, IEnumerable`1 , IEnumerable`1)
at kP.PopulateTable(dY name, lc pageProvider, kH schemaProvider, jE tracker)
at hT.a(List`1)
at hQ.PopulateAdditionalSystemTables(dY[] tables)
at h.a(Int32)
at W.l()
at W.GetSqlCmd(Int32 view)
at h.a(Int32)
at K.b()
at K.RedGate.ObjectLevelRecovery.Engine.CTB.ISCSimplePopulator.get_Sql()
at G.<EnumObjects>d__0.MoveNext()
at u.Read()
at w.Read()
at b.a(aM)
at b.a(aM)
at b.a(aM)
at bC.a(aM)
at b.<>c__DisplayClassf.<ObjectsTreeViewAfterSelect>b__c(Object param0)
at System.Threading._ThreadPoolWaitCallback.WaitCallback_Context(Object state)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
at System.Threading._ThreadPoolWaitCallback.PerformWaitCallback(Object state)


I wasn't even doing anything tricky: trying to restore a 5-row table from a backup of database A into database B. Disk/database free space is not an issue. I tried with compressed+encrypted, compressed+no encryption, and no compression+no encryption backups. The table was in the dbo schema, so no issue with a missing schema. The table does not use user-defined data types. It's about as simple as it gets.

SQLBackup 7.2.1.4
SQL Server 2005 SP4 (9.0.5000)
.NET 2 (v2.0.50727)
Windows 2003 R2 SP2 Std Edition x86 on VMware
1 CPU
1GB RAM

Ideas? More info required?

Scott.

RE: Use of file location tags in "Copy backup to network location"

Quote:

Can I use the file location tags in the "Copy Backup To Network Location" option?


Yes, you can. E.g.

Code:
EXEC master..sqlbackup '-sql "BACKUP DATABASES [*] TO DISK = [g:\backups\<DATABASE>\<TYPE>\AUTO.sqb] WITH COPYTO = [\\sqlfileserver\backups\<DATABASE>\<TYPE>\<DATETIME yyyy mm>\], ERASEFILES_PRIMARY = 2b, ERASEFILES_SECONDARY = 14b"'


In this example, g:\backups is a local folder. This command will back up all your databases to that local folder, in subfolders named after the database and then the backup type. At most 2 backup sets will be retained in each local folder (ERASEFILES_PRIMARY = 2b). It will also copy each backup file to a network share, in subfolders named after the database, the backup type, and the current year and month. Only 14 backup sets will be retained in each of those folders (ERASEFILES_SECONDARY = 14b).

Why create subfolders named after the database name and backup type? This helps to reduce the number of files in any one of those folders. Whenever you use any of the ERASEFILES_* options, SQL Backup needs to determine which backup files to keep and which to delete every time a backup completes. If your instance contains hundreds of databases, and you have transaction log backups running every 10 minutes, the number of files will grow rapidly. If you only backed up to a folder named 'g:\backups\', SQL Backup would need to process thousands of files every time. This would not be an issue if you only had a few databases backed up.

Likewise for files copied to the network share. Again, if you have a lot of backup files stored remotely, it's advisable to organise the files by database name and backup type for the reasons above.

One caveat when using folders named after the current year and month is that when SQL Backup creates a new folder based on those attributes, it won't look into the previously created folders to delete the older files. E.g. using the example above, SQL Backup may copy today's backup files into

\\sqlfileserver\backups\AdventureWorks\LOG\2013 02\...

We can expect to find 14 backup sets in that folder at any one time, since we're using the ERASEFILES_SECONDARY = 14b option. Tomorrow, when SQL Backup starts storing files in

\\sqlfileserver\backups\AdventureWorks\LOG\2013 03\...

it won't look in the \2013 02\ subfolder to erase the older files. By April, you will have the last 14 backup sets from February in the February folder and the last 14 backup sets from March in the March folder, which is pretty useless, as they won't form a complete recovery chain.
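
If keeping the remote retention working matters more than the date-based layout, one possible workaround (a sketch only, reusing the same placeholder paths as above) is to leave the <DATETIME ...> tag out of the COPYTO path, so all copies for a given database and backup type land in a single folder that ERASEFILES_SECONDARY can manage:

Code:
EXEC master..sqlbackup '-sql "BACKUP DATABASES [*] TO DISK = [g:\backups\<DATABASE>\<TYPE>\AUTO.sqb] WITH COPYTO = [\\sqlfileserver\backups\<DATABASE>\<TYPE>\], ERASEFILES_PRIMARY = 2b, ERASEFILES_SECONDARY = 14b"'

The trade-off is that you lose the year-and-month folder structure, so any migration to long-term storage would have to be driven by file dates instead.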

RE: Is the copying of backups to a network location separate?

The copying is performed as part of the backup process for all backup types other than for transaction logs.

Transaction log backup files are copied in a separate process.

RE: Object Level Recovery error

I am the support engineer responding to this request. Object Level Recovery (OLR) has a few limitations, and it seems like you are running into one of them. I am sorry and want to apologise for the trouble this has caused on your side.

Just to get a better understanding of the situation, are you able to let me know the following:

1) Location of the backup?
2) The size of the backup file?
3) The size of the database when online in SQL Server?

I know you have said that it's just a five row table. Just wanted to confirm the above to be sure! Apologies for repeating this!

Also, let me know whether the source and the destination server for the backup are this same server, or whether the backup was taken on a different server.

One possible reason for this could be the version of .NET on this server. Are you able to try this on a machine with version 3.5 or above?

We welcome your business and look forward to hearing about your experience with this tool.

Thanks for your patience and feedback in this matter.

File browser unable to access local resource

I've got around a dozen servers running SQL Backup. We are upgrading one and I've got everything installed.

The backup agent is running as Administrator, and it's still in TRIAL mode, as I haven't gotten a license to use at this time.

When I go to restore a file/Browse for backup files/Add Files, the file browser shows the local server with a red X, saying it cannot connect to the resource.

I've never had this happen before. Running Windows Server 2008 R2, SQL Backup Pro 7.0.4.2.

I'm sure it's something stupid, but.. any ideas?

Thanks,

Marc

RE: File browser unable to access local resource

Sorry.. I should add that I'm logging in as sa.

Programmatically set SMTP host

Hi Everyone,

I wanted to see if there is a way to set the SMTP host information programmatically rather than going through "Tools" -> "Server Options" -> "Email Settings".

Does anyone know of a way to do this?

I tried checking out the SQL Compact database and searching for possible config files on the file system, but I didn't see where this information is stored on the server. I was browsing the forums as well and have yet to find anything that might point me in the right direction.

My end goal is to set up a script/process that can go in and update the SMTP host information across all of my SQL Backup installations.

I'm running SQL Backup 7.2.1.82.

RE: Programmatically set SMTP host

Try this:

Code:
EXEC master..sqbutility 1040, 'SMTPHost', 'myhost.smtp.com'
EXEC master..sqbutility 1041, 'SMTPPort', 135

RE: File browser unable to access local resource

Are you able to browse the local folders when logged on to the SQL Server instance (via the SQL Backup GUI) using a Windows authenticated account?

RE: File browser unable to access local resource

Yes, I am. That's what was so confusing. SSMS was able to do things with no problem.

I did eventually find the issue... I think I installed the agent and then changed the SA password. I don't know if the agent uses that password, but I noticed you do have to enter it. There was no way for me to change it (or no way I could find), so I finally uninstalled and reinstalled the agent and everything was fine.