[nycbug-talk] direct file access denied via htaccess
max at neuropunks.org
Thu Jun 16 10:53:56 EDT 2005
And on top of this, you can store your images in a DB (yes, lots of overhead for busy sites), generate them on the fly with random names once a session is established, and then unlink them once that session is terminated. Of course, you can only imagine the overhead involved..
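A minimal Python sketch of the random-name-then-unlink part (the directory name, function names, and the use of a temp dir are all my own assumptions for illustration; the DB side is left out):

```python
import os
import secrets
import tempfile

# Hypothetical sketch: publish per-session images under unguessable
# random names, then delete them all when the session ends.
SERVE_DIR = tempfile.mkdtemp(prefix="imgsess_")

def publish_image(image_bytes):
    """Write the image under a random name; return that name."""
    name = secrets.token_urlsafe(16) + ".jpg"
    with open(os.path.join(SERVE_DIR, name), "wb") as f:
        f.write(image_bytes)
    return name

def end_session(names):
    """Unlink every image that was published for this session."""
    for n in names:
        try:
            os.unlink(os.path.join(SERVE_DIR, n))
        except FileNotFoundError:
            pass  # already gone; nothing to do
```

Once `end_session` runs, any bookmarked URL pointing at one of those random names just 404s.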
Another point was to use tables. I've seen people use the actual image as the background of a table, and then overlay a 1-pixel GIF onto that table, so right-clicking saves only the pixel. Of course viewing the source and wget'ing is still possible, but then you can start denying requests for any browser ID besides IE/Mozilla, but then I rewrite my browser ID, and the game goes on..
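The User-Agent denial part could look something like this (a toy Python check, not a real server config; the substrings and function name are my own assumptions, and as noted above it's trivially defeated by anyone who sets their own User-Agent):

```python
# Crude allow-list on the User-Agent header. "MSIE" appears in IE's
# User-Agent string; "Mozilla" appears in most browsers'.
ALLOWED_AGENT_SUBSTRINGS = ("MSIE", "Mozilla")

def allow_request(user_agent):
    """Return True if the User-Agent looks like IE or Mozilla."""
    return any(s in user_agent for s in ALLOWED_AGENT_SUBSTRINGS)
```

A stock wget sends something like "Wget/1.x" and gets denied; `wget --user-agent="Mozilla/5.0"` sails right through, which is exactly the cat-and-mouse game described above.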
You can also mess with JPEG headers to uniquely tag images (there's a JPEG Perl module somewhere on CPAN), or maybe with steganography, to at least be able to track their spread through the net, provided people don't look for these things.
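As a sketch of the header-tagging idea (in Python rather than Perl, and with a made-up tag string): the JPEG format allows a COM (comment) segment, marker 0xFFFE, whose 2-byte big-endian length field counts itself plus the payload. Inserting one right after the SOI marker (0xFFD8) tags the file without touching the image data:

```python
import struct

def tag_jpeg(jpeg, tag):
    """Insert a COM (0xFFFE) segment right after the SOI marker.
    The length field covers its own two bytes plus the payload."""
    if jpeg[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG (missing SOI marker)")
    payload = tag.encode("ascii")
    segment = b"\xff\xfe" + struct.pack(">H", len(payload) + 2) + payload
    return jpeg[:2] + segment + jpeg[2:]
```

Anyone who opens the file in a hex editor (or re-saves it through an image editor) will find or strip the tag, hence the caveat about people looking for these things.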
On Thu, Jun 16, 2005 at 10:30:13AM -0400, michael wrote:
> On Thu, 16 Jun 2005 10:12:28 -0400
> Steve Rieger <steve.rieger at tbwachiat.com> wrote:
> > I am not worried about people trying to grab the images, as they won't
> > even know that they exist. This site takes input from the user and puts
> > together a package that includes images and movies, and I do not want
> > somebody to be able to bookmark one of those movies or images and refer
> > to it at a later date.
> > Hope I made myself clear.
> How about using a session, and if it has expired, the files cannot be retrieved?
> Tell them: this is available for 2 minutes, or something like that. Have a server-side script (PHP or something similar) take the URL string, interpret it, check it against the server session, and make a decision.