[SunHELP] SUMMARY: Cron jobs and NFS
Christopher Singleton
cas40 at bu.edu
Mon Sep 26 13:09:33 CDT 2005
Hello,
Got a few good replies in reference to doing cron jobs on NFS-shared
folders. Most stated that the cron job should run on the computer where
the files actually reside, if possible, since local operations will
always be faster than network ones. Thanks to Will Mcdonald, Andrew Hay
and Nadine for the following responses. My original post is below with
their replies. Thanks again,
Chris
> Hello all,
>
> I have an SB 150 running a fully patched Solaris 9. I've set it
> up so that I can use NFS to export files to an SGI O2 running IRIX
> 6.5.27. Ideally, I would like to export all files from a particular
> directory that are less than 24 hours old, and remove files from the
> directory I exported to when they age over 24 hours (so no matter what
> time the user sees the directory, all he sees are the files that are 24
> hours old or less). My main question is: should I use cron to do this, or
> is there some kind of mirroring that I can do that will only import
> files of a certain age or younger? And if I do use cron, should I
> delete the files using the crontab on the Sun box that is exporting the
> files, or should I set up a cron job on the SGI box that is
> importing/receiving the files? Is there a practical difference between
> the two? Thanks for any help, will summarize.
You could easily carry this out from cron, say once an hour. I'd
suggest doing the deletion on the NFS server as this will prevent
unnecessary network I/O. If you can carry out tasks on the local disk
it'll always be quicker.
$ find . -name "*.txt" -mtime +0 -ls
Will list everything ending in .txt older than 24 hours.
$ find . -name "*.txt" -mtime +0 -exec rm -f {} \;
Would delete everything ending in .txt older than 24 hours. As always,
make sure first that it's removing what you expect it to before you
actually do the rm. :)
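For example, an hourly crontab entry on the Sun box might look like the
following sketch (the path /export/data is a stand-in for whatever
directory is actually being exported):

# hourly: remove exported .txt files more than 24 hours old
0 * * * * find /export/data -name "*.txt" -mtime +0 -exec rm -f {} \;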
Will.
I don't know offhand of any mirroring or synchronizing tool that'll do
what you want, but you can get the effect with hard links:
1. Set up a dummy directory on your source system.
2. NFS-export the dummy directory. I think it'd have to be in the same
   filesystem as the source directory so that hard file links can be
   used, and only the overhead for the directory itself would be needed.
3. Run a find on it to delete old file links. Harmless if empty.
4. Run a find on the source directory to find new files and link them
   into the dummy directory. This fails harmlessly for links that
   already exist.
[3] and [4] would go in crontab on the exporting system; see the sketch
after this list. This assumes you don't want to actually delete the old
files, just hide them from the importing system.
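A sketch of those two crontab entries on the exporting Sun box (the
paths /data/src and /data/pub are placeholders for the real source and
exported dummy directories, which must be in the same filesystem):

# delete links to files more than 24 hours old from the exported dir
0 * * * * find /data/pub -type f -mtime +0 -exec rm -f {} \;
# five minutes later, hard-link files under 24 hours old into it; ln's
# complaints about links that already exist are harmless, so discard them
5 * * * * find /data/src -type f -mtime -1 -exec ln {} /data/pub \; 2>/dev/null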
______________________________________________________
Andrew Hay
Unless there is a reason for NFS beyond the SGI, I'd recommend a
cron job locally to take out the older files, then rsync to copy the files
to the SGI. But, I work in a security-paranoid environment these days
and NFS is "right out".
If indeed you stick with NFS, then running the cron job on the machine
where the file system is local is a better idea. If you lose network
connectivity between the two boxes while the cron job is running,
having it run on the SGI could leave the file system with a mix of
old and new files.
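A sketch of that approach as a script run from the Sun box's cron (the
hostname sgi-o2 and both paths are placeholders; it assumes rsync is
installed on both machines and the SGI is reachable over ssh):

#!/bin/sh
# prune files more than 24 hours old, then mirror the directory to the
# SGI; --delete removes files on the SGI side that no longer exist here
find /export/data -type f -mtime +0 -exec rm -f {} \;
rsync -a --delete -e ssh /export/data/ sgi-o2:/import/data/

Note the trailing slash on the source: it makes rsync copy the
directory's contents rather than the directory itself.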
=Nadine=