Hiya, I have a little code snippet like this:

Code:
LINK=`/bin/grep -f http://www.spamcop.net/sc?id= $FILENAME`
echo $LINK
LINK=`/bin/fgrep http://www.spamcop.net/sc?id= $FILENAME`
echo $LINK
LINK=`/bin/ls $FILENAME`
echo $LINK

The output is this:

Code:
grep: http://www.spamcop.net/sc?id=: No such file or directory
/bin/fgrep: line 2: exec: grep: not found
1147891663.9720_2.t390.greatnet.de:2,ST
grep: http://www.spamcop.net/sc?id=: No such file or directory
/bin/fgrep: line 2: exec: grep: not found
1147896741.16696_2.t390.greatnet.de:2,ST

Why do grep and fgrep not work, but the ls does?
They are on the same system, but I have to use grep -F and not grep -f ^^ I still wonder why fgrep does not work, though...
Not sure what you mean. The shell script looks like this now:

Code:
#!/bin/bash

# ENTER PATH OF THE VERIFICATION EMAILS FROM SPAMCOP
PATH="/home/mail/web4p1/Maildir/.Spamcop-Reply/cur"

# ENTER WEBPATH TO PHP SCRIPT
URL="http://www.roleplayer.org/spamcop.php"

#################################################################
#################################################################

cd $PATH

for FILENAME in *S
do
    LINK=`/bin/grep -F http://www.spamcop.net/sc?id= $FILENAME`
    echo $LINK

#    LINK=`/bin/fgrep http://www.spamcop.net/sc?id= $FILENAME`
#    echo $LINK

#    LINK=`/bin/ls $FILENAME`
#    echo $LINK

#    lynx -dump $URL?link=$LINK

#    /bin/rm $FILENAME
done
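A likely cause of the "fgrep: line 2: exec: grep: not found" error from the first post: the script assigns PATH="/home/mail/...", which overwrites the shell's command search path, and fgrep is typically a one-line wrapper script that runs `exec grep -F "$@"` and can then no longer find grep. A minimal sketch of the fix, assuming the variable is simply renamed (MAILDIR is a name chosen here, not from the original; a temporary directory with one fake mail file stands in for the real Maildir path so the sketch is self-contained):

```shell
#!/bin/bash
# Sketch: use a differently named variable (MAILDIR, hypothetical)
# instead of PATH, so external commands are still found.
MAILDIR=$(mktemp -d)    # stand-in for the real Maildir path from the post
printf 'http://www.spamcop.net/sc?id=12345\n' > "$MAILDIR/1147891663.msg:2,S"

cd "$MAILDIR" || exit 1

for FILENAME in *S
do
    # grep -F treats the pattern as a literal string (no regex, so the
    # '?' in the URL is harmless) and searches the local mail file
    LINK=$(grep -F 'http://www.spamcop.net/sc?id=' "$FILENAME")
    echo "$LINK"
done
```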
You can use grep with local files, e.g.
Code:
grep bla /path/to/example.txt
but not with URLs, like
Code:
grep bla http://www.example.com/example.txt
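To make that concrete, a small self-contained sketch (the file contents and pattern here are made up for illustration): grep simply treats the URL as a path and fails to open it.

```shell
#!/bin/sh
# Demo: grep reads local files; a URL is just a nonexistent file name.
tmpfile=$(mktemp)
printf 'http://www.spamcop.net/sc?id=12345\n' > "$tmpfile"

# Local file: works, prints the matching line
grep -F 'http://www.spamcop.net/sc?id=' "$tmpfile"

# URL: grep tries to open it as a path and fails
grep -F 'bla' 'http://www.example.com/example.txt' 2>/dev/null \
  || echo 'grep: cannot open a URL as a file'

rm -f "$tmpfile"
```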
You could use wget to download the file and then pipe it into grep.
Code:
wget -q -O - "http://www.example.com/example.txt" | grep bla