
Re: please use robots.txt for your gopher apps



> > 1) If there are parts that you don't want crawled, then use a robots.txt
> >    to define those,
> >
> > That's what Cameron is asking.
> 
> > I'm considering a policy requirement that sites to be accepted to the
> > new servers page must have some sort of robots.txt
> 
> Then I misread it, I guess.

I've settled on a middle ground: asking for this to be done voluntarily on
the new list, and we'll see how well that works.
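For anyone setting one up, here is a minimal sketch of what such a file might
look like, assuming the crawler requests the "robots.txt" selector from the
server root and honors standard User-agent/Disallow records; the disallowed
selectors below are purely hypothetical examples, not anything specific to
the servers under discussion:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /dynamic/

Anything not covered by a Disallow line would then be treated as crawlable.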

-- 
------------------------------------ personal: http://www.cameronkaiser.com/ --
  Cameron Kaiser * Floodgap Systems * www.floodgap.com * ckaiser@floodgap.com
-- Seen on hand dryer: "Push button for a message from your congressman." -----

