
I've leaned heavily toward this for the last 8 or so years.

I've yet to see anyone mistakenly modify anything when they need to pass --commit, whereas I've repeatedly had people accidentally modify things because they forgot --dry-run.





I wouldn’t want most things to work this way:

    $ rm file.bin
    $ rm --commit file.bin
    $ cat foo.txt > bar.txt
    $ cat foo.txt | tee --write-for-real bar.txt
    $ cp balm.mp3 pow.mp3
    $ cp --i-mean-it balm.mp3 pow.mp3
There is a time and a place for it but it should not be the majority of use cases.

Totally agree it shouldn't be for basic tools; but if I'm developing a script that performs any kind of logic before reaching out to a DB or vendor API and modifying 100k user records, a flag that just verifies the sanity of the logic is a necessity.
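A minimal sketch of that pattern in bash (the record IDs and update_user here are stand-ins for the real record set and API/DB call):

```shell
#!/usr/bin/env bash
# Default to a dry run; only perform the destructive step when --commit is passed.
set -euo pipefail

COMMIT=false
if [ "${1:-}" = "--commit" ]; then
    COMMIT=true
fi

update_user() {                      # stand-in for the real API/DB call
    echo "updated user $1"
}

for id in 101 102 103; do            # stand-in for the real record set
    if "$COMMIT"; then
        update_user "$id"
    else
        echo "DRY RUN: would update user $id"
    fi
done
```

Running it bare prints what it would do; only `./script --commit` actually mutates anything.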

    if [ -n "$DRY_RUN" ] ; then
        shopt -s expand_aliases  # bash doesn't expand aliases in scripts by default
        alias rm='echo rm'
        alias cp='echo cp'
    fi
Of course, output redirects will still overwrite the files, since the shell does it and IIRC this behaviour can't be changed.

set -o noclobber
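With noclobber set, a plain > redirect refuses to overwrite an existing file, and >| explicitly overrides the check. A quick demo:

```shell
set -o noclobber
f=$(mktemp -u)                       # a path that doesn't exist yet
echo first > "$f"                    # creating a new file is fine
if ! echo second > "$f" 2>/dev/null; then
    echo "refused to overwrite $f"   # plain > fails on an existing file
fi
echo third >| "$f"                   # >| forces the overwrite
cat "$f"                             # prints "third"
rm "$f"
```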

Yep. The first thing I do for this kind of thing is add a preview=true flag so I don't accidentally run destructive actions.

I like that idea even better as an environment variable that takes precedence over the command-line parameters.

For most of these local data manipulation type of commands, I'd rather just have them behave dangerously, and rely on filesystems snapshots to rollback when needed. With modern filesystems like zfs or btrfs, you can take a full snapshot every minute and keep it for a while to negate the damage done by almost all of these scripts. They double as a backup solution too.
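As a sketch, the per-minute snapshot could be a cron entry like the following (the dataset name tank/home is a placeholder; in practice a tool such as zfs-auto-snapshot or sanoid handles the scheduling and pruning for you):

```
# /etc/cron.d fragment: snapshot the dataset every minute (% must be escaped in crontab)
* * * * * root zfs snapshot tank/home@auto-$(date +\%Y\%m\%d-\%H\%M)
```

Recovery is then `zfs rollback` to the snapshot you want, or copying individual files back out of the dataset's hidden .zfs/snapshot/ directory; a companion job has to prune old snapshots so they don't accumulate forever.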

I used to have alias rm='rm -i' for a few years to be careful, but I took it out once I realised I had just begun adding -f all the time.

Yeah, but that's because it's implemented poorly. It literally asks you to confirm deletion of each file individually, even for thousands of files.

What it should do is generate a user-friendly overview of what's to be deleted, by grouping files together by some criteria, e.g. by directory, so you'd only need to confirm a few times regardless of how many files you want to delete.
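A rough sketch of that idea as a bash function (confirm_and_rm is hypothetical, not a real rm mode): collect the targets, group them by directory, and ask once per group. Note the GNU-specific `xargs -d '\n'`, and that filenames containing newlines would break this sketch.

```shell
#!/usr/bin/env bash
# Hypothetical wrapper: one confirmation per directory instead of per file.
confirm_and_rm() {
    declare -A groups                # directory -> newline-separated file list
    local f d
    for f in "$@"; do
        d=$(dirname -- "$f")
        groups[$d]+="$f"$'\n'
    done
    for d in "${!groups[@]}"; do
        local n
        n=$(printf '%s' "${groups[$d]}" | grep -c .)
        printf 'delete %s file(s) under %s? [y/N] ' "$n" "$d"
        read -r reply
        if [ "$reply" = y ]; then
            printf '%s' "${groups[$d]}" | xargs -d '\n' rm --
        fi
    done
}
```

Usage would look like `confirm_and_rm ./build/*.o ./logs/*.log`, giving one prompt for ./build and one for ./logs no matter how many files match.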


See also rm -I (capital i), which prompts only once, before deleting recursively or before deleting more than three files

Even in those basic examples, it probably would be useful. `cp` to a blank file? No problem. `cp` over an existing file? Yeah, I want to be warned.

`rm` a single file? Fine. `rm /`? Maybe block that one.
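For the cp case, that warning already exists as an opt-in: cp -i prompts before overwriting an existing destination (and -n skips silently); it's just not the default. For example:

```shell
# cp -i asks before overwriting; answering "n" leaves the target untouched.
d=$(mktemp -d)
printf 'old\n' > "$d/dst.txt"
printf 'new\n' > "$d/src.txt"
echo n | cp -i "$d/src.txt" "$d/dst.txt" 2>/dev/null || true
cat "$d/dst.txt"                 # still prints "old"
rm -r "$d"
```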


That last one would error without doing anything anyway because it's not recursive.

Uhuh:

    $ rm -rf /
    rm: it is dangerous to operate recursively on '/'
    rm: use --no-preserve-root to override this failsafe

That's a special case that is a) easy to call accidentally from a script when variables end up being unset and b) almost never a sensible thing to do.

--dry-run should default to true



