Batch to intelligently delete files and folders

Discussion forum for all Windows batch related topics.

Moderator: DosItHelp

foxidrive
Expert
Posts: 6031
Joined: 10 Feb 2012 02:20

Re: Batch to intelligently delete files and folders

#16 Post by foxidrive » 16 Nov 2014 19:00

jbseventeen wrote:Hmm...not quite since it even renames files <7 days old.

As I understood the task, the files going into the target folder manually aren't going to be there for long before this script runs.
This will put a 30 day expiry on them and in 30 days they won't exist.
#### 1. moved line "md "%target%" 2>nul" further down to prevent %target% from being created when there are no files to process ####

Again, the task mentioned that there were always going to be files in this folder.
If it's going to be empty for a couple of days then there is no damage from having an empty folder on the desktop.
) #### 2. removed parameter "/xd "%target%" " to allow processing of files in %target%\ ####

This breaks the code, as you have outlined the rules.
if filename=="*-del-on-*" #### 4. I'm embarrassed that I don't know how to write this. also prefix is being changed to suffix "*-del-on-yyyy-mm-dd.*" further down ####

A suffix needs more work to parse, and the usual solution (delayed expansion) breaks files with a ! in the name.
With a prefix the files also sort correctly by deletion date.

if you need to find a file with "this text" in the name you can use a batch file to show you the current full filename.

You mentioned that you have 80 GB HDDs, that these drives also store the data files, and that replacing them is not economical.
This leads me to think that the data is not backed up anywhere...

Samir
Posts: 384
Joined: 16 Jul 2013 12:00
Location: HSV
Contact:

Re: Batch to intelligently delete files and folders

#17 Post by Samir » 16 Nov 2014 21:50

jbseventeen wrote:@Samir
Glad I'm not alone. It is a lot of work but it could work. Will have to do a lot of research and testing first though.
It's a common issue when there's a handful of computers and inexperienced users. It took me about a day to play with steadystate and find the ways that it can be broken. But it was well worth that time investment since setting up an actual system was done in under an hour. 8)

Don't discount the idea of thin clients either. They're truly plug and play and can be put into production even faster than steadystate if you have little to install or change.

I actually use a two-pronged approach: thin clients as the endpoint for users, which then RDP into steadystate boxes. This way, I'm able to blend how much is on the client and how much is on the server (the RDP server). I then have a file server that all the files are saved on. Has worked pretty well for a few years now. 8)

jbseventeen
Posts: 11
Joined: 13 Nov 2014 18:47

Re: Batch to intelligently delete files and folders

#18 Post by jbseventeen » 17 Nov 2014 05:14

As I understood the task, the files going into the target folder manually aren't going to be there for long before this script runs.
This will put a 30 day expiry on them and in 30 days they won't exist.

The idea is that you can have whatever you want, wherever you want in the work area, as long as you are continuously modifying the file. The filter should be applied only to files that have not been modified in 7 days. It helps to refine the scope of the filter to protect users from themselves.
#### 1. moved line "md "%target%" 2>nul" further down to prevent %target% from being created when there are no files to process ####

In an ideal scenario, the user would start organizing his files on his own, in which case, not having to see that goddamned folder would be the reward.
if you need to find a file with "this text" in the name you can use a batch file to show you the current full filename.

Could you elaborate? The only reason for changing it into a suffix is to make it easier for users to find their files so that they can organize them.
You mentioned that you have 80 GB HDDs, that these drives also store the data files, and that replacing them is not economical.
This leads me to think that the data is not backed up anywhere...

You're right that the data in the work area (in this case desktop) is not backed up anywhere. There is a partition on the drive where files are supposed to be stored. The warning at 7 days and deletion at 30 days is ample time to save needed files from deletion.

1. How about a separate step for files exclusively in the %target% folder? (This should also occur before the deletion process):

- if no prefix on filename
-- if modified date >= 7 days
--- add "del in 30 days" prefix
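
Something like this forfiles dry run is roughly what I have in mind. It only echoes instead of renaming, it assumes %target% from the earlier script, and EXPIRY is just my placeholder for the today-plus-30-days date that the main script already computes:

```batch
@echo off
rem Dry-run sketch of step 1: select files directly in %target% that were
rem last modified 7 or more days ago (/d -7) and show the rename that the
rem real script would perform. EXPIRY is a placeholder for today+30 days.
set "target=%userprofile%\Desktop\Marked-for-deletion"
forfiles /p "%target%" /d -7 /c "cmd /c if @isdir==FALSE echo would rename @file to del-on-EXPIRY @file"
```

A check that the name does not already carry the prefix would still be needed before the real ren.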

2. I notice that if archive.zip is extracted to desktop\archive\
the folder structure of \archive\ is lost during the operations.
Would it be possible to do something like this?:

-if file >7 days # e.g. desktop\archive\driver\winxp\driver.sys
--calculate %source% # which should be archive\driver\winxp\
--move %source%\file %target%\%source%\file #which would be desktop\marked-for-deletion\archive\driver\winxp\driver.sys
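
From what I've read, robocopy may be able to preserve the relative folder structure by itself, so perhaps %source% would not need to be computed at all. A sketch, assuming %target% and the desktop as the work area (/l makes it a dry run; remove it to actually move the files):

```batch
@echo off
rem Sketch: move files modified 7+ days ago (/minage:7) from the desktop into
rem %target%, keeping the folder tree (/s), while excluding %target% itself
rem (/xd) so it is not processed into itself. /mov moves each file after
rem copying it; /l only lists what would happen.
set "target=%userprofile%\Desktop\Marked-for-deletion"
robocopy "%userprofile%\Desktop" "%target%" /s /mov /minage:7 /xd "%target%" /l
```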

foxidrive
Expert
Posts: 6031
Joined: 10 Feb 2012 02:20

Re: Batch to intelligently delete files and folders

#19 Post by foxidrive » 17 Nov 2014 05:32

jbseventeen wrote:
(excluding shortcuts and the "Desktop\Marked-for-deletion\" folder)


I do see one potential issue that I had not thought of before:
1. Files in the "marked-for-deletion" folder that are simply renamed without the prefix would not get deleted.
2. Moving non-prefixed files into the marked-for-deletion folder grants them immunity.

This wasn't clear, as your rule to exclude the "Desktop\Marked-for-deletion\" folder was the primary focus of the task.

Now it seems that your users are putting files in a folder called "Marked-for-deletion" even though the files have nothing to do with being marked for deletion,
and the folder can be used for any purpose.
You're right that the data in the work area (in this case desktop) is not backed up anywhere. There is a partition on the drive where files are supposed to be stored. The warning at 7 days and deletion at 30 days is ample time to save needed files from deletion.

This isn't going to help when there is a HDD failure, which is what I was trying to point out in a roundabout way.
2. I notice that if archive.zip is extracted to desktop\archive\
the folder structure of \archive\ is lost during the operations.
Would it be possible to do something like this?:

I'm unsure how this is happening as my code doesn't have any feature to extract zip files. :D
jbseventeen wrote:In an ideal scenario, the user would start organizing his files on his own, in which case, not having to see that goddamned folder would be the reward.

You can add a line at the end

Code:

rd "%target%" 2>nul


if you need to find a file with "this text" in the name you can use a batch file to show you the current full filename.

Could you elaborate?

This batch file can be used as follows: file.bat "this text" and it will list any file with "this text" in the filename:

Code:

@echo off
rem Assumes %target% is set to the folder to search.
rem %~1 strips the quotes from the argument so the wildcard pattern stays valid.
dir "%target%\*%~1*" /b /s /a-d
pause

jbseventeen
Posts: 11
Joined: 13 Nov 2014 18:47

Re: Batch to intelligently delete files and folders

#20 Post by jbseventeen » 17 Nov 2014 17:17

Mgmt does not agree that HDD failure is a risk worth spending money on (for backup drives), so I try to mitigate what I can with S.M.A.R.T. monitoring (which I know is not always effective) and irregular manual backups. Not sure I understand the relevance though.

If it was me, it'd take me no time to realize that files in the target folder are excluded from processing and then this would all have been a waste of time.

The "archive.zip" was just an example :) And I doubt that any of the employees would go for a dos based search.

But I get it. My requirements are fastidious, to put it politely, so I guess I'll just have to spend the time learning to do this myself. Still can't thank you enough for all the help. Should speed up the learning process if nothing else.

foxidrive
Expert
Posts: 6031
Joined: 10 Feb 2012 02:20

Re: Batch to intelligently delete files and folders

#21 Post by foxidrive » 17 Nov 2014 20:55

jbseventeen wrote:Mgmt does not agree that HDD failure is a risk worth spending money on (for backup drives). Not sure I understand the relevance though.


It would seem from that statement that the files being created are worth nothing, if they are lost.

I'm wondering if the machines are on a LAN. If the files have any value, then copying them across the network to a backup store would be the minimum backup; a scheduled task could run to do that.
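
As a sketch of that minimum backup, with placeholder paths (D:\data for the data partition and \\server\backup\pc1 for the store), the copy itself could be one robocopy line:

```batch
@echo off
rem Copy anything new or changed on the data partition to the network store.
rem /e includes subfolders; robocopy already skips files that are unchanged,
rem so re-runs are quick. /r:1 /w:5 keeps retries short if the server is busy.
robocopy "D:\data" "\\server\backup\pc1" /e /r:1 /w:5 /log:"%temp%\backup-log.txt"
```

Scheduling it is then a one-liner from an admin prompt, e.g. `schtasks /create /tn "Nightly backup" /tr "C:\scripts\backup.bat" /sc daily /st 02:00` (the script path is again a placeholder).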


I'm responsible for backups here and so I'm a bit of a backup nazi, but I've had many people come to me with personal machines that have a failed HDD and ask about file recovery.
The cost of file recovery on a single HDD can be from around $500 to $5000 and the cost of a single 1TB backup drive is less than $100 - so the economics are really in favour of a backup scheme.

Squashman
Expert
Posts: 4471
Joined: 23 Dec 2011 13:59

Re: Batch to intelligently delete files and folders

#22 Post by Squashman » 17 Nov 2014 21:14

A wise man once told me that any data you do not have at least two backups of is data you don't care about. Where I work, saving data locally can get you fired. If you want something backed up you save it to the network.

jbseventeen
Posts: 11
Joined: 13 Nov 2014 18:47

Re: Batch to intelligently delete files and folders

#23 Post by jbseventeen » 18 Nov 2014 06:17

Oh they're worth something alright. But they only realize this for one fleeting moment when a hard drive does fail. It's a small engineering company and they want to spend as little money as possible on the "rabble".

It's worked out alright, with only two Seagate hard drives coming up with S.M.A.R.T. errors in the last 6 months, so no losses. I've taken a policy of just avoiding Seagate drives, which seems to go a long way.

The LAN was dismantled due to renovations that are still ongoing. But yes, I really ought to set up Task Scheduler to copy the drives to the server once it's back up. This is where automation of dealing with garbage files would help. Hopefully I'll get some time to look into it.

foxidrive
Expert
Posts: 6031
Joined: 10 Feb 2012 02:20

Re: Batch to intelligently delete files and folders

#24 Post by foxidrive » 18 Nov 2014 06:38

jbseventeen wrote:Oh they're worth something alright. But they only realize this for one fleeting moment when a hard drive does fail. It's a small engineering company

They are an engineering company, and can't understand that electro-mechanical hardware can fail without warning? Remind me not to get them to design me anything. :D
I've taken a policy of just avoiding seagate drives which seems to go a long way.

Any brand of drive can fail.

In the car industry there used to be a joke that the 'lemons', the cars that broke down repeatedly, were the ones that were built by management, on Fridays. ;)

Squashman wrote:A wise man once told me that any data you do not have at least two backups of is data you don't care about.

Amen to that!

Squashman
Expert
Posts: 4471
Joined: 23 Dec 2011 13:59

Re: Batch to intelligently delete files and folders

#25 Post by Squashman » 18 Nov 2014 07:32

A little story about hard drives.

I used to use Maxtor drives religiously. Had a pair running mirrored on my Linux server for several years without any issues.

I then went to build a new Windows Desktop for my wife. Ordered a Maxtor drive. DOA! Ordered another Maxtor drive. DOA! Went down to Best Buy, picked up a Seagate, and finally had the computer up and running after 3 weeks of HD madness. Stuff happens. Doesn't mean one is better than the other.

Samir
Posts: 384
Joined: 16 Jul 2013 12:00
Location: HSV
Contact:

Re: Batch to intelligently delete files and folders

#26 Post by Samir » 18 Nov 2014 08:48

jbseventeen wrote:Oh they're worth something alright. But they only realize this for one fleeting moment when a hard drive does fail. It's a small engineering company and they want to spend as little money as possible on the "rabble".

It's worked out alright with only 2 seagate hard drives that came up with smart errors in the last 6 months so no losses. I've taken a policy of just avoiding seagate drives which seems to go a long way.

The LAN was dismantled due to renovations that are still ongoing. But yes I really ought to set up Task Scheduler to copy the drives to the server once it's back up. This is where automation of dealing with garbage files would help. Hopefully I'll get some time to look into it.

Since the task of repairing this damage after an HD failure falls on you, I think you should just get a high-capacity (256 GB) flash drive or a 1 TB external HD and back up each system manually at least once a week. This way, when there is a failure, you can just get the system back up from your backup and then act like you're working for a while, as it would have been a lot more work to recover the files otherwise.

Samir
Posts: 384
Joined: 16 Jul 2013 12:00
Location: HSV
Contact:

Re: Batch to intelligently delete files and folders

#27 Post by Samir » 18 Nov 2014 08:51

Squashman wrote:A little story about hard drives.

I used to use Maxtor drives religiously. Had a pair running mirrored on my Linux server for several years without any issues.

I then went to build a new Windows Desktop for my wife. Ordered a Maxtor drive. DOA! Ordered another Maxtor drive. DOA! Went down to Best Buy picked up a Seagate and finally had the computer up and running after 3 weeks of HD madness. Stuff happens. Doesn't mean one is better than the other.

Yep, every manufacturer makes great drives and crap ones.

The key is to buy enterprise-class drives with ridiculous MTBFs and warranties. The Western Digital RE series is a perfect example of that.

It used to be that SCSI drives were always the best in terms of reliability. But now that SCSI has also gone serial (SAS), the same drive hardware can be used, so it's hard to pick the good from the bad, just like with SATA drives.
