
Search for File

Posted by Anonymous 
Search for File
September 28, 2011 07:17AM
Hi,

I need to search for a specific file on a host, via backuppc. Is there a way to search a host backup, so I don't have to manually go through all directories via the web interface?

Gerald

------------------------------------------------------------------------------
All the data continuously generated in your IT infrastructure contains a
definitive record of customers, application performance, security
threats, fraudulent activity and more. Splunk takes this data and makes
sense of it. Business sense. IT sense. Common sense.
[p.sf.net]
_______________________________________________
BackupPC-users mailing list
BackupPC-users < at > lists.sourceforge.net
List: [lists.sourceforge.net]
Wiki: [backuppc.wiki.sourceforge.net]
Project: [backuppc.sourceforge.net]
Search for File
September 28, 2011 07:40AM
Gerald Brandt <gbr < at > majentis.com> wrote on 09/28/2011 10:15:12 AM:

Quote

I need to search for a specific file on a host, via backuppc. Is
there a way to search a host backup, so I don't have to manually go
through all directories via the web interface?
The easiest, most direct way of doing that would be:

cd /path/to/host/pc/directory
find . | grep "f<filename>"

I'm sure someone with more shell-fu will give you a much better command line (and I look forward to learning something!). I'm sure there's a way to do it simply with the find command alone, but I've had limited success trying to limit find to specific files. For me, it's easier to use grep as above. My way will work, if a bit slowly: there are a lot of files in there...

Don't forget the leading f in the filename: BackupPC puts an f in front of every filename in the directory structure.
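As a tiny worked illustration of that f-prefix (the mock tree below is made up; point find at your real pc/&lt;host&gt;/&lt;backupnum&gt; directory instead):

```shell
# Build a throwaway mock of an f-mangled pc tree, then search it.
mkdir -p /tmp/mockpc/123/fhome/fgerald
touch /tmp/mockpc/123/fhome/fgerald/fbudget.xls

cd /tmp/mockpc/123
find . | grep "fbudget"          # ./fhome/fgerald/fbudget.xls
find . -name "fbudget*"          # same match using find alone
```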

Tim Massey
Out of the Box Solutions, Inc.
Creative IT Solutions Made Simple!

http://www.OutOfTheBoxSolutions.com
tmassey < at > obscorp.com 22108 Harper Ave.
St. Clair Shores, MI 48080
Office: (800)750-4OBS (4627)
Cell: (586)945-8796
Search for File
September 28, 2011 07:53AM
Hi Tim,

That's basically what I did, but I have a couple of BackupPC users that have no clue about command line stuff, so I was hoping for a BackupPC web based solution.

Gerald

Quote

From: "Timothy J Massey" <tmassey < at > obscorp.com>
To: "General list for user discussion, questions and support" <backuppc-users < at > lists.sourceforge.net>
Sent: Wednesday, September 28, 2011 9:30:18 AM
Subject: Re: [BackupPC-users] Search for File

Gerald Brandt <gbr < at > majentis.com> wrote on 09/28/2011 10:15:12 AM:

Quote

I need to search for a specific file on a host, via backuppc. Is
there a way to search a host backup, so I don't have to manually go
through all directories via the web interface?
The easiest, most direct way of doing that would be:

cd /path/to/host/pc/directory
find . | grep "f<filename>"

I'm sure someone with more shell-fu will give you a much better command line (and I look forward to learning something!). I'm sure there's a way to do it simply with the find command alone, but I've had limited success trying to limit the find command to find specific files. For me, it's easier to use grep as above. My way will work, if a bit slowly: there's lots of files in there...

Don't forget the leading f in the filename: BackupPC puts an f in front of every filename in the directory structure.

Tim Massey
Out of the Box Solutions, Inc.
Creative IT Solutions Made Simple!

http://www.OutOfTheBoxSolutions.com
tmassey < at > obscorp.com 22108 Harper Ave.
St. Clair Shores, MI 48080
Office: (800)750-4OBS (4627)
Cell: (586)945-8796

Search for File
September 28, 2011 08:10AM
Don't know if it's faster than your way or not, but I've used:
find -type f -name "*thing_i_want"
Note that you can use wildcards...

a.

On Wed, Sep 28, 2011 at 10:52 AM, Gerald Brandt <gbr < at > majentis.com> wrote:
Quote

Hi Tim,

That's basically what I did, but I have a couple of BackupPC users that have no clue about command line stuff, so I was hoping for a BackupPC web based solution.

Gerald

________________________________

From: "Timothy J Massey" <tmassey < at > obscorp.com>
To: "General list for user discussion, questions and support" <backuppc-users < at > lists.sourceforge.net>
Sent: Wednesday, September 28, 2011 9:30:18 AM
Subject: Re: [BackupPC-users] Search for File

Gerald Brandt <gbr < at > majentis.com> wrote on 09/28/2011 10:15:12 AM:

Quote

I need to search for a specific file on a host, via backuppc.  Is
there a way to search a host backup, so I don't have to manually go
through all directories via the web interface?
The easiest, most direct way of doing that would be:

cd /path/to/host/pc/directory
find . | grep "f<filename>"

I'm sure someone with more shell-fu will give you a much better command line (and I look forward to learning something!).  I'm sure there's a way to do it simply with the find command alone, but I've had limited success trying to limit the find command to find specific files.  For me, it's easier to use grep as above.  My way will work, if a bit slowly:  there's lots of files in there...

Don't forget the leading f in the filename:  BackupPC puts an f in front of every filename in the directory structure.

Tim Massey

Out of the Box Solutions, Inc.
Creative IT Solutions Made Simple!
[www.OutOfTheBoxSolutions.com]
tmassey < at > obscorp.com       22108 Harper Ave.
St. Clair Shores, MI 48080
Office: (800)750-4OBS (4627)
Cell: (586)945-8796


--
"The universe is probably littered with the one-planet graves of
cultures which made the sensible economic decision that there's no
good reason to go into space--each discovered, studied, and remembered
by the ones who made the irrational decision." - Randall Munroe

Search for File
September 28, 2011 08:23AM
On Wednesday 28 September 2011 16:30:18 Timothy J Massey wrote:
Quote

Gerald Brandt <gbr < at > majentis.com> wrote on 09/28/2011 10:15:12 AM:
Quote

I need to search for a specific file on a host, via backuppc. Is
there a way to search a host backup, so I don't have to manually go
through all directories via the web interface?
The easiest, most direct way of doing that would be:

cd /path/to/host/pc/directory
find . | grep "f<filename>"

I'm sure someone with more shell-fu will give you a much better command
line (and I look forward to learning something!).
Here you are:

find <path_where_to_start> -iname <string_to_search>

-iname means case-insensitive, so you don't have to worry about capitalization.
If you want to search for a combination of directory and filename, you have to
think about the 'f' BackupPC puts in front of every path element.
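For the directory-plus-filename case, one sketch (the mock directory below stands in for a real pc/&lt;host&gt;/&lt;n&gt; tree) is to prefix each path element with 'f' and use find's -ipath:

```shell
# Mock tree standing in for pc/<host>/<n>; every path element is
# prefixed with 'f', so prefix each element of the search too.
mkdir -p /tmp/mockpc2/fdocs
touch /tmp/mockpc2/fdocs/fReport.pdf

find /tmp/mockpc2 -ipath "*/fdocs/freport*"   # case-insensitive: matches fReport.pdf
```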

Using find you will realize that it's rather slow and has your disk rattling
away. Better to use an indexing service, for example locate:

locate <string_to_search>

gives a list of hits, but only from the last time locate rebuilt its
index (which should happen daily/nightly). That is good enough to find files last seen
two weeks ago, but it won't find the file you just downloaded and can't remember
where you saved it.

There are also disk-indexing services with web-frontends, htdig comes to mind.
That even finds stuff inside the files.

Have fun,

Arnold

Search for File
September 28, 2011 08:33AM
Arnold Krille <arnold < at > arnoldarts.de> wrote on 09/28/2011 11:20:57 AM:

Quote

Quote

I'm sure someone with more shell-fu will give you a much better command
line (and I look forward to learning something!).
Here you are:

find <path_where_to_start> -iname <string_to_search>
Now I remember why I stick with the grep form: remembering the different syntax of the find command. As a *not* old-time UNIX guru (but a long-time, if not full-time, *Linux* user), I think that any parameter of multiple letters (like -name) should have two dashes! :) I am often frustrated by the unusual find command syntax, so I simply stick with grep, which has many more uses beyond finding files.

Quote

Using find you will realize that it's rather slow and has your disk rattling
away. Better to use the indexing services, for example locate:

locate <string_to_search>
Yeah, that's great if you update the locate database (as you mention). On a backup server, with millions of files and lots of work to do pretty much around the clock? That's one of the first things I disable! So no locate.

Timothy J. Massey
Out of the Box Solutions, Inc.
Creative IT Solutions Made Simple!

http://www.OutOfTheBoxSolutions.com
tmassey < at > obscorp.com 22108 Harper Ave.
St. Clair Shores, MI 48080
Office: (800)750-4OBS (4627)
Cell: (586)945-8796
Search for File
September 28, 2011 11:24AM
On Wednesday 28 September 2011 17:23:17 Timothy J Massey wrote:
Quote

Arnold Krille <arnold < at > arnoldarts.de> wrote on 09/28/2011 11:20:57 AM:
Quote

Using find you will realize that it's rather slow and has your disk
rattling away. Better to use the indexing services, for example locate:

locate <string_to_search>
Yeah, that's great if you update the locate database (as you mention). On
a backup server, with millions of files and lots of work to do pretty much
around the clock? That's one of the first things I disable! So no locate.
You could limit locate to the paths you want to be indexed. Or you could
exclude the (c)pool of BackupPC and still get the information.
And adding some minutes of updatedb indexing the filesystem tree (it's not even
indexing the contents) to BackupPC_nightly shouldn't hurt that much.
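One hedged way to do that exclusion (variable name per mlocate's updatedb.conf(5); the paths are examples, adjust for your install): add the pool directories to PRUNEPATHS in /etc/updatedb.conf.

```shell
# /etc/updatedb.conf -- sketch; keep your distribution's existing entries
# and append the BackupPC pool paths so updatedb skips them:
PRUNEPATHS="/tmp /var/spool /media /var/lib/backuppc/pool /var/lib/backuppc/cpool"
```

The pc tree (the per-host directory structure) stays indexed, so locate can still answer filename queries; only the content-addressed pool, which has no meaningful names anyway, is skipped.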

Have fun,

Arnold

Search for File
September 28, 2011 06:33PM
On Wed, 28 Sep 2011, Timothy J Massey wrote:

Quote

Arnold Krille <arnold < at > arnoldarts.de> wrote on 09/28/2011 11:20:57 AM:

Quote

Quote

I'm sure someone with more shell-fu will give you a much better command
line (and I look forward to learning something!).
Here you are:

find <path_where_to_start> -iname <string_to_search>
...

Quote

Using find you will realize that it's rather slow and has your disk
rattling away. Better to use the indexing services, for example locate:

locate <string_to_search>
Yeah, that's great if you update the locate database (as you mention). On
a backup server, with millions of files and lots of work to do pretty much
around the clock? That's one of the first things I disable! So no locate.
Hmmm.

When I want to search for a file (half the time I don't even know what
machine or from what time period, so I have to search the entire pool), I
look at the mounted backuppcfs fuse filesystem (I mount onto /snapshots):
[svn.ulyssis.org]

What if you let mlocate index into the /snapshots ?

I haven't tested getting it to index /snapshots, but mlocate doesn't descend
into directories whose mtime hasn't changed. If backuppcfs
correctly preserves mtimes for directories, then updatedb.mlocate will do
the right thing and be a lot quicker than regular old updatedb. Then make
sure that cron runs it at a time appropriate for you (when I was doing
night shift, this *wasn't* at 4am!), and you won't even notice that it's
busy.

Then wrap locate up in a simple cgi script to present to your users
instead of training them how to use locate on the commandline.
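A minimal sketch of such a wrapper (the database path is a made-up example, and the input sanitizing is deliberately crude; harden it before real use):

```shell
#!/bin/sh
# Minimal CGI wrapper around locate. Expects a query like ?q=report.
# /var/lib/backuppc/locate.db is a hypothetical dedicated database.
echo "Content-Type: text/plain"
echo ""
# keep only safe filename characters from the query string
pattern=$(printf '%s' "$QUERY_STRING" | sed 's/^q=//; s/[^A-Za-z0-9._-]//g')
if [ -n "$pattern" ]; then
  locate -d /var/lib/backuppc/locate.db "$pattern"
fi
```

Dropped behind the existing BackupPC web server, this gives point-and-click users a search box without any command-line training.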

--
Tim Connors

Search for File
September 28, 2011 11:19PM
On Wednesday, 28 September 2011, Gerald Brandt wrote:
Quote

Hi,

I need to search for a specific file on a host, via backuppc. Is there a
way to search a host backup, so I don't have to manually go through all
directories via the web interface?
Maybe another solution: you can simply look at the last full's XferLOG for
that host and use the browser's search function to locate the desired file.
Then you can use BackupPC's history function to look at the different
versions of that file.

Bye, Bernd

Search for File
October 01, 2011 06:40PM
Tim Connors wrote at about 11:15:31 +1000 on Thursday, September 29, 2011:
Quote

On Wed, 28 Sep 2011, Timothy J Massey wrote:

Quote

Arnold Krille <arnold < at > arnoldarts.de> wrote on 09/28/2011 11:20:57 AM:

Quote

I'm sure someone with more shell-fu will give you a much better command
line (and I look forward to learning something!).
Here you are:

find <path_where_to_start> -iname <string_to_search>
...

Quote

Using find you will realize that it's rather slow and has your disk
rattling away. Better to use the indexing services, for example locate:

locate <string_to_search>
Yeah, that's great if you update the locate database (as you mention). On
a backup server, with millions of files and lots of work to do pretty much
around the clock? That's one of the first things I disable! So no locate.
Hmmm.

When I want to search for a file (half the time I don't even know what
machine or from what time period, so I have to search the entire pool), I
look at the mounted backuppcfs fuse filesystem (I mount onto /snapshots):
[svn.ulyssis.org]
I too would recommend backuppc-fuse - though the one disadvantage is
that it is a lot slower than a native search through the pc tree since
the directories need to be reconstructed from the relevant partials &
fulls (which is a *good* thing but slow).

Search for File
October 01, 2011 06:59PM
Timothy J Massey wrote at about 10:30:18 -0400 on Wednesday, September 28, 2011:
Quote

Gerald Brandt <gbr < at > majentis.com> wrote on 09/28/2011 10:15:12 AM:

Quote

I need to search for a specific file on a host, via backuppc. Is
there a way to search a host backup, so I don't have to manually go
through all directories via the web interface?
The easiest, most direct way of doing that would be:

cd /path/to/host/pc/directory
find . | grep "f<filename>"
I think it would generally be faster to do:
find . -name "f<filename>"

This still may have a problem in that the f-mangling *also* converts
non-printable ASCII characters (and also whitespace and /) into %<hex>
codes. So, if your filename contains any of those characters, you need
to write the search term that way.
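A rough sketch of that mangling (an illustration only, not BackupPC's actual code; it handles just '%' and '/' here, while the real mangling covers more characters):

```shell
# Illustrative sketch of BackupPC-style name mangling: prepend 'f' and
# encode '%' and '/' as %<hex>. Not BackupPC's real implementation.
mangle() {
  printf 'f'
  printf '%s' "$1" | while IFS= read -r -n1 ch; do
    case "$ch" in
      [%/]) printf '%%%02x' "'$ch" ;;   # e.g. '%' -> %25, '/' -> %2f
      *)    printf '%s' "$ch" ;;
    esac
  done
  printf '\n'
}

mangle 'budget%1.xls'    # -> fbudget%251.xls
```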

Also, you need to be careful about incrementals vs. fulls since incrementals
will include only the most recently changed files while fulls might
not include the latest version if there are subsequent incrementals.

You can avoid both of the above problems by using backuppc-fuse as
pointed out by another respondent, though it may be slower.
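If you stay with the raw pc tree, one hedged workaround for the incremental-vs-full issue is to scan a host's backups newest-first and stop at the first one containing the file (mock paths below; the newest hit is usually, though not always, the latest version):

```shell
# Mock pc/<host> tree: the file was backed up in #1; #2 is an
# incremental that didn't touch it.
mkdir -p /tmp/mockhost/1/fdocs /tmp/mockhost/2
touch /tmp/mockhost/1/fdocs/freport.txt

cd /tmp/mockhost
for n in $(ls -d [0-9]* 2>/dev/null | sort -rn); do
  matches=$(find "$n" -name "freport*")
  if [ -n "$matches" ]; then
    echo "found in backup $n:"
    echo "$matches"
    break
  fi
done
```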
