A question about URL redirection in .htaccess

Apache URL Rewriting Guide
An Introduction to mod_rewrite

Apache's mod_rewrite module is an extremely powerful, but also extremely complex, module for manipulating URLs. With mod_rewrite you can handle almost every URL-related problem; the price you pay is the time it takes to understand its complicated architecture. Beginners rarely grasp mod_rewrite right away, and even Apache experts occasionally discover new aspects where mod_rewrite can help. In other words: with mod_rewrite you either give up after the first attempt and never touch it again, or you rely on it from then on, because its power is simply enormous. This chapter presents a number of successful, real-world examples for you to study, rather than answering questions in FAQ style.

Practical Solutions

Many solutions remain undiscovered here, so please study these examples patiently to learn how mod_rewrite works. Note: because every server is configured differently, you may have to adjust the examples to make them work, for instance adding the [PT] flag when mod_alias and mod_userdir are also in use, or moving the rules from the main server configuration into .htaccess files. Try to understand how each example works rather than memorizing it blindly.

URL Layout

Canonical URLs

Description:
On some webservers a resource can be reached under several URLs. Usually one canonical URL is published (the one that is actually distributed); the others are shortcuts, internal addresses, and so on. Regardless of which URL the user enters, the URL the user finally sees should be the canonical one.

Solution:
We redirect all non-canonical URLs to their canonical counterpart. The following rules replace the non-canonical "/~user" with the canonical "/u/user" and append a missing trailing slash:

    RewriteRule   ^/~([^/]+)/?(.*)        /u/$1/$2   [R]
    RewriteRule   ^/([uge])/([^/]+)$      /$1/$2/    [R]

Canonical Hostnames

Description: (omitted)

Solution:

    RewriteCond %{HTTP_HOST}    !^fully\.qualified\.domain\.name [NC]
    RewriteCond %{HTTP_HOST}    !^$
    RewriteCond %{SERVER_PORT}  !^80$
    RewriteRule ^/(.*)  http://fully.qualified.domain.name:%{SERVER_PORT}/$1 [L,R]
    RewriteCond %{HTTP_HOST}    !^fully\.qualified\.domain\.name [NC]
    RewriteCond %{HTTP_HOST}    !^$
    RewriteRule ^/(.*)  http://fully.qualified.domain.name/$1 [L,R]

Moved DocumentRoot

Description:
The URL "/" is usually mapped directly onto the DocumentRoot, but the DocumentRoot is not always tied to one fixed directory; it may be just one of several mapped areas. For instance, our intranet uses /e/www/ (the WWW area) and /e/sww/ (the intranet area). Because all web material lives under /e/www/, we have to make sure that all inlined images and other content there still work.

Solution:
We simply redirect "/" to "/e/www/". Doing this with mod_rewrite is cleaner than with mod_alias, because a URL alias only compares the prefix of the URL, whereas a redirect may involve another server and therefore need a different prefix (the prefix is already constrained by the DocumentRoot). mod_rewrite is the best solution here:

    RewriteEngine on
    RewriteRule   ^/$  /e/www/  [R]

Trailing Slash Problem

Description:
Every webmaster has suffered from the trailing slash problem. If the trailing slash is missing, the server treats the URL as invalid and returns an error, because it looks for a file named foo under /~quux/foo instead of displaying the directory. In most cases you can leave it to the user to add the "/" himself, but sometimes you have to do it yourself, for example after a long chain of URL rewrites that ends at a CGI script.

Solution:
The intuitive answer is to let Apache add the trailing slash automatically, using an external redirect so that the browser ends up at the correct URL. With a purely internal rewrite only the directory page itself would display correctly; the images on that page would break because of their relative URLs. For example, a request for image.gif inside /~quux/foo/index.html would resolve to /~quux/image.gif after an internal rewrite. So we use:

    RewriteEngine on
    RewriteBase   /~quux/
    RewriteRule   ^foo$  foo/  [R]

The same can also be done from a .htaccess file in each directory; note that such a per-directory setting overrides the main configuration file:

    RewriteEngine on
    RewriteBase   /~quux/
    RewriteCond   %{REQUEST_FILENAME}  -d
    RewriteRule   ^(.+[^/])$           $1/  [R]
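The effect of that per-directory ruleset can be sketched in Python. This is a simulation only, not how Apache implements it; the `is_dir` callback stands in for the `-d` filesystem test:

```python
import re

def add_trailing_slash(path, is_dir):
    """Simulate:  RewriteCond %{REQUEST_FILENAME} -d
                  RewriteRule ^(.+[^/])$ $1/ [R]
    Return a (status, location) external redirect when `path` names a
    directory but lacks a trailing slash; otherwise return None
    (request passes through untouched)."""
    if is_dir(path) and re.match(r'^(.+[^/])$', path):
        # [R] without an explicit code defaults to a 302 redirect
        return (302, path + '/')
    return None
```

For example, `add_trailing_slash('/~quux/foo', lambda p: True)` yields `(302, '/~quux/foo/')`, while a path that already ends in "/" is left alone.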
Webcluster Through Homogeneous URL Layout

Description:
All webservers in the cluster should present an identical URL layout: whichever host receives the request, the user gets the same page, so the URLs become independent of the individual server. Even when one Apache server cannot respond, a consistent (server-independent) page can still be delivered to the user from elsewhere in the cluster.

Solution:
First, the server needs external map files recording which host holds each user's, group's, or entity's data. The format is:

    user1  server_of_user1
    user2  server_of_user2
    :      :

Store these as map.xxx-to-host files. Then instruct the server to redirect URLs of the forms

    /u/user/anypath
    /g/group/anypath
    /e/entity/anypath

to

    http://physical-host/u/user/anypath
    http://physical-host/g/group/anypath
    http://physical-host/e/entity/anypath

When a request arrives, the server maps the URL as follows (names with no entry in a map default to server0):

    RewriteEngine on
    RewriteMap  user-to-host    txt:/path/to/map.user-to-host
    RewriteMap  group-to-host   txt:/path/to/map.group-to-host
    RewriteMap  entity-to-host  txt:/path/to/map.entity-to-host
    RewriteRule ^/u/([^/]+)/?(.*)  http://${user-to-host:$1|server0}/u/$1/$2
    RewriteRule ^/g/([^/]+)/?(.*)  http://${group-to-host:$1|server0}/g/$1/$2
    RewriteRule ^/e/([^/]+)/?(.*)  http://${entity-to-host:$1|server0}/e/$1/$2
    RewriteRule ^/([uge])/([^/]+)/?$         /$1/$2/.www/
    RewriteRule ^/([uge])/([^/]+)/([^.]+.+)  /$1/$2/.www/$3

Move Homedirs to a Different Webserver

Description:
Many webmasters face this problem: during an upgrade all user home directories have to move from an old webserver to a new one.

Solution:
With mod_rewrite this is trivial: simply redirect all /~user/anypath URLs to http://newserver/~user/anypath:

    RewriteEngine on
    RewriteRule   ^/~(.+)  http://newserver/~$1  [R,L]

Structured Homedirs

Description:
Hosts with many users usually structure the home directories, placing each user directory under a parent directory named after the user's first letter. So /~foo/anypath maps to /home/f/foo/.www/anypath, and /~bar/anypath to /home/b/bar/.www/anypath.

Solution:
Map the URLs straight onto the filesystem:

    RewriteEngine on
    RewriteRule   ^/~(([a-z])[a-z0-9]+)(.*)  /home/$2/$1/.www$3

Filesystem Reorganization

Description:
A tricky example: use RewriteRules to present an entire directory tree without touching its existing structure.

Background: net.sw is an archive of free Unix software, stored in this layout:

    drwxrwxr-x   2 netsw  users    512 Aug  3 18:39 Audio/
    drwxrwxr-x   2 netsw  users    512 Jul  9 14:37 Benchmark/
    drwxrwxr-x  12 netsw  users    512 Jul  9 00:34 Crypto/
    drwxrwxr-x   5 netsw  users    512 Jul  9 00:41 Database/
    drwxrwxr-x   4 netsw  users    512 Jul 30 19:25 Dicts/
    drwxrwxr-x  10 netsw  users    512 Jul  9 01:54 Graphic/
    drwxrwxr-x   5 netsw  users    512 Jul  9 01:58 Hackers/
    drwxrwxr-x   8 netsw  users    512 Jul  9 03:19 InfoSys/
    drwxrwxr-x   3 netsw  users    512 Jul  9 03:21 Math/
    drwxrwxr-x   3 netsw  users    512 Jul  9 03:24 Misc/
    drwxrwxr-x   9 netsw  users    512 Aug  1 16:33 Network/
    drwxrwxr-x   2 netsw  users    512 Jul  9 05:53 Office/
    drwxrwxr-x   7 netsw  users    512 Jul  9 09:24 SoftEng/
    drwxrwxr-x   7 netsw  users    512 Jul  9 12:17 System/
    drwxrwxr-x  12 netsw  users    512 Aug  3 20:15 Typesetting/
    drwxrwxr-x  10 netsw  users    512 Jul  9 14:08 X11/

We wanted to publish this archive with its directory structure directly browsable, but without rearranging the existing layout to suit the web. Moreover, since the area is also served over FTP, we did not want to add any webpages or CGI scripts inside it.

Solution:
The solution has two parts. The first is a set of CGI scripts that render the directory structure; they live in /e/netsw/.www/ alongside the data:

    -rw-r--r--   1 netsw  users    1318 Aug  1 18:10 .wwwacl
    drwxr-xr-x  18 netsw  users     512 Aug  5 15:51 DATA/
    -rw-rw-rw-   1 netsw  users  372982 Aug  5 16:35 LOGFILE
    -rw-r--r--   1 netsw  users     659 Aug  4 09:27 TODO
    -rw-r--r--   1 netsw  users    5697 Aug  1 18:01 netsw-about.html
    -rwxr-xr-x   1 netsw  users     579 Aug  2 10:33 netsw-access.pl
    -rwxr-xr-x   1 netsw  users    1532 Aug  1 17:35 netsw-changes.cgi
    -rwxr-xr-x   1 netsw  users    2866 Aug  5 14:49 netsw-home.cgi
    drwxr-xr-x   2 netsw  users     512 Jul  8 23:47 netsw-img/
    -rwxr-xr-x   1 netsw  users   24050 Aug  5 15:49 netsw-lsdir.cgi
    -rwxr-xr-x   1 netsw  users    1589 Aug  3 18:43 netsw-search.cgi
    -rwxr-xr-x   1 netsw  users    1885 Aug  1 17:41 netsw-tree.cgi
    -rw-r--r--   1 netsw  users     234 Jul 30 16:35 netsw-unlimit.lst

The DATA/ subdirectory holds the archive itself; the net.sw software in it is updated automatically via rdist.

The second part hooks the archive and the new CGI pages together. We want to hide DATA/ while running the right CGI script for each requested URL. First redirect /net.sw/ to /e/netsw/:

    RewriteRule  ^net.sw$       net.sw/     [R]
    RewriteRule  ^net.sw/(.*)$  e/netsw/$1

The first rule merely appends the missing trailing "/"; the second performs the real mapping. Then store the following configuration in /e/netsw/.www/.wwwacl:

    Options       ExecCGI FollowSymLinks Includes MultiViews
    RewriteEngine on
    #   we are reached via /net.sw/ prefix
    RewriteBase   /net.sw/
    #   first we rewrite the root dir to
    #   the handling cgi script
    RewriteRule   ^$                       netsw-home.cgi     [L]
    RewriteRule   ^index\.html$            netsw-home.cgi     [L]
    #   strip out the subdirs when
    #   the browser requests us from perdir pages
    RewriteRule   ^.+/(netsw-[^/]+/.+)$    $1                 [L]
    #   and now break the rewriting for local files
    RewriteRule   ^netsw-home\.cgi.*       -                  [L]
    RewriteRule   ^netsw-changes\.cgi.*    -                  [L]
    RewriteRule   ^netsw-search\.cgi.*     -                  [L]
    RewriteRule   ^netsw-tree\.cgi$        -                  [L]
    RewriteRule   ^netsw-about\.html$      -                  [L]
    RewriteRule   ^netsw-img/.*$           -                  [L]
    #   anything else is a subdir which gets handled
    #   by another cgi script
    RewriteRule   !^netsw-lsdir\.cgi.*     -                  [C]
    RewriteRule   (.*)                     netsw-lsdir.cgi/$1

Hints:
1. Note the L (last) flag and the '-' (no substitution) in the fourth part.
2. Note the ! (not) character and the C (chain) flag in the first rule of the last part.
3. Note the catch-all pattern of the very last rule.

From NCSA imagemap to Apache mod_imap

Description:
Many people want a smooth migration from the old NCSA webserver to Apache, including converting old NCSA imagemaps to Apache's mod_imap. The problem: many hyperlinks still reference the imagemaps under the old URL /cgi-bin/imagemap/path/to/page.map, while under Apache the map is simply /path/to/page.map.

Solution:
We just strip the "/cgi-bin/imagemap" prefix on the fly:

    RewriteEngine on
    RewriteRule   ^/cgi-bin/imagemap(.*)  $1  [PT]

Search Pages in More than One Directory

Description:
Not even MultiViews can instruct Apache to search for a page in more than one directory.

Solution:
Use the following ruleset:

    RewriteEngine on
    #   first try to find it in custom/...
    #   ...and if found stop and be happy:
    RewriteCond   /your/docroot/dir1/%{REQUEST_FILENAME}  -f
    RewriteRule   ^(.+)  /your/docroot/dir1/$1  [L]
    #   second try to find it in pub/...
    #   ...and if found stop and be happy:
    RewriteCond   /your/docroot/dir2/%{REQUEST_FILENAME}  -f
    RewriteRule   ^(.+)  /your/docroot/dir2/$1  [L]
    #   else go on for other Alias or ScriptAlias directives,
    #   etc.
    RewriteRule   ^(.+)  -  [PT]

Set Environment Variables According to URL Parts

Description:
You want to pass state between pages, but without using CGI; you would rather encode the information in the URL itself.

Solution:
The following rule extracts the variable and its value from the URL and stores it in a custom environment variable, which XSSI or CGI can later read. For example, /foo/S=java/bar/ becomes /foo/bar/, and "java" is written into the environment variable STATUS:

    RewriteEngine on
    RewriteRule   ^(.*)/S=([^/]+)/(.*)  $1/$3  [E=STATUS:$2]

Virtual User Hosts

Description:
You want to map requests for www.username.host.com directly onto the filesystem from the DNS record alone, without using Apache's virtual host feature.

Solution:
This works only for HTTP/1.1 requests, because they carry a Host: header we can rewrite on:

    RewriteCond %{HTTP_HOST}  ^www\.[^.]+\.host\.com$
    RewriteRule ^(.+)         %{HTTP_HOST}$1             [C]
    RewriteRule ^www\.([^.]+)\.host\.com(.*)  /home/$1$2

Extended Redirection

Description:
Sometimes we need more control over the character escaping performed on redirects, because Apache's URL escaping would also escape characters that must survive, such as the "#" in "url#anchor" URLs. How can we redirect to such URLs?

Solution:
We use an NPH-CGI script that performs the redirect itself, since NPH (non-parsed headers) responses are not post-processed by the server. First we introduce a new URL scheme, xredirect:, with one of the last rewrite rules in the per-server configuration:

    RewriteRule ^xredirect:(.+)  /path/to/nph-xredirect.cgi/$1 [T=application/x-httpd-cgi,L]

This pipes all xredirect:-prefixed URLs through the nph-xredirect.cgi program, which looks like this:

    #!/path/to/perl
    ##
    ##  nph-xredirect.cgi -- NPH/CGI script for extended redirects
    ##
    $| = 1;
    $url = $ENV{'PATH_INFO'};
    print "HTTP/1.0 302 Moved Temporarily\n";
    print "Server: $ENV{'SERVER_SOFTWARE'}\n";
    print "Location: $url\n";
    print "Content-type: text/html\n";
    print "\n";
    print "<html>\n";
    print "<head>\n";
    print "<title>302 Moved Temporarily (EXTENDED)</title>\n";
    print "</head>\n";
    print "<body>\n";
    print "<h1>Moved Temporarily (EXTENDED)</h1>\n";
    print "The document has moved <a HREF=\"$url\">here</a>.<p>\n";
    print "</body>\n";
    print "</html>\n";
    ##EOF##

With this you can redirect to any URL scheme, including the ones mod_rewrite does not accept directly. For example, you can redirect a URL to a news server:

    RewriteRule ^anyurl  xredirect:news:newsgroup

Note: you do not need to append [R] or [R,L] to this rule, because the xredirect: prefix is expanded later by the pipe-through rule above.

Time-Dependent Rewriting

Description:
You want the same URL to serve different content at different times of day.

Solution:
Use the TIME_xxx variables in rewrite conditions. Between 07:00 and 19:00 foo.day.html is served; the rest of the time, foo.night.html:

    RewriteEngine on
    RewriteCond   %{TIME_HOUR}%{TIME_MIN}  >0700
    RewriteCond   %{TIME_HOUR}%{TIME_MIN}  <1900
    RewriteRule   ^foo\.html$  foo.day.html
    RewriteRule   ^foo\.html$  foo.night.html

Backward Compatibility for Old URLs

Description:
After changing a file's extension, how can the old URLs still reach the new file?

Solution:
Use mod_rewrite to map the old URL to the new file when the new file exists, and otherwise fall back to the original file:

    #   backward compatibility ruleset for
    #   rewriting document.html to document.phtml
    #   when and only when document.phtml exists
    #   but no longer document.html
    RewriteEngine on
    RewriteBase   /~quux/
    #   parse out basename, but remember the fact
    RewriteRule   ^(.*)\.html$  $1  [C,E=WasHTML:yes]
    #   rewrite to document.phtml if exists
    RewriteCond   %{REQUEST_FILENAME}.phtml  -f
    RewriteRule   ^(.*)$  $1.phtml  [S=1]
    #   else reverse the previous basename cutout
    RewriteCond   %{ENV:WasHTML}  ^yes$
    RewriteRule   ^(.*)$  $1.html

Content Handling

From Old to New (intern)

Description:
Assume we have renamed the page foo.html to bar.html and want to keep the old URL working, without even letting users see that the file was renamed.

Solution:
Map the old file name internally to the new one:

    RewriteEngine on
    RewriteBase   /~quux/
    RewriteRule   ^foo\.html$  bar.html

From Old to New (extern)

Description:
As in the previous example we renamed foo.html to bar.html, but this time we want to redirect the user's browser to the new file, i.e. the URL shown in the browser changes.

Solution:
Force an external redirect to the new URL:

    RewriteEngine on
    RewriteBase   /~quux/
    RewriteRule   ^foo\.html$  bar.html  [R]

Browser-Dependent Content

Description:
A good webpage should support every browser; for example, we want to send the full-featured page to Netscape but a plain-text version to Lynx.

Solution:
Content negotiation (mod_negotiation) cannot be used here, because browsers do not provide their type in a form Apache negotiates on, so we must decide from the "User-Agent" header. If the User-Agent begins with "Mozilla/3", the page foo.html is rewritten to foo.NS.html; browsers reporting "Lynx" or "Mozilla" version 1 or 2 get foo.20.html; all other browsers get foo.32.html:

    RewriteCond %{HTTP_USER_AGENT}  ^Mozilla/3.*
    RewriteRule ^foo\.html$         foo.NS.html  [L]
    RewriteCond %{HTTP_USER_AGENT}  ^Lynx/.*         [OR]
    RewriteCond %{HTTP_USER_AGENT}  ^Mozilla/[12].*
    RewriteRule ^foo\.html$         foo.20.html  [L]
    RewriteRule ^foo\.html$         foo.32.html  [L]

Dynamic Mirror

Description:
You want to link pages from a remote host into your own web area. For an FTP server you could use the mirror program to copy the newest files to your host; for a webserver you could use webcopy over HTTP, but that has a drawback: the files are only as new as the last webcopy run. Much better is to fetch the newest version from the origin at request time.

Solution:
Use the Proxy Throughput feature (flag [P]) to map a remote webpage, or even a whole remote web area, directly into our namespace:

    RewriteEngine on
    RewriteBase   /~quux/
    RewriteRule   ^hotsheet/(.*)$  http://www.tstimpreso.com/hotsheet/$1  [P]

Reverse Dynamic Mirror

Description: (omitted)

Solution:

    RewriteEngine on
    RewriteCond   /mirror/of/remotesite/$1  -U
    RewriteRule   ^http://www\.remotesite\.com/(.*)$  /mirror/of/remotesite/$1

Webserver Load Balancing

Description:
Suppose we want to balance the traffic for www.foo.com over several actual servers. How can this be done?

Solution:
There are several approaches:

1. DNS round-robin: the simplest method, letting the nameserver (e.g. BIND) rotate the A records of www.foo.com over the physical hosts.

2. DNS load balancing: the lbnamed program distributes user requests to different servers at the DNS level; it is a sophisticated DNS load-balancing program written in Perl 5 together with some auxiliary tools.

3. Proxy-throughput round-robin: here we use mod_rewrite and its proxy throughput feature. First add a DNS record making www an alias:

       www    IN  CNAME   www0.foo.com.

   Then turn www0.foo.com into a dedicated proxy-only server that distributes the incoming requests over five real servers (www1-www5), using lb.pl and the following mod_rewrite rules:

       RewriteEngine on
       RewriteMap    lb  prg:/path/to/lb.pl
       RewriteRule   ^/(.+)$  ${lb:$1}  [P,L]

   The lb.pl program:

       #!/path/to/perl
       ##
       ##  lb.pl -- load balancing script
       ##
       $| = 1;
       $name   = "www";     # the hostname base
       $first  = 1;         # the first server (not 0 here, because 0 is myself)
       $last   = 5;         # the last server in the round-robin
       $domain = "foo.dom"; # the domainname
       $cnt = 0;
       while (<STDIN>) {
           $cnt = (($cnt+1) % ($last+1-$first));
           $server = sprintf("%s%d.%s", $name, $cnt+$first, $domain);
           print "http://$server/$_";
       }
       ##EOF##

   Note that the load on the www host itself stays as high as before, but its only job is now the distribution; all the SSI, CGI, and ePerl requests are processed by the other servers, so the overall load drops considerably.

4. Hardware/TCP round-robin: Cisco's LocalDirector distributes requests at the TCP/IP level, with the logic effectively burnt into the hardware. Hardware solutions usually cost serious money, but they deliver the highest performance.

Reverse Proxy for Request Distribution

Description: (omitted)

Solution:

    ##
    ##  apache-rproxy.conf -- Apache configuration for Reverse Proxy Usage
    ##
    #   server type
    ServerType           standalone
    Port                 8000
    MinSpareServers      16
    StartServers         16
    MaxSpareServers      16
    MaxClients           16
    MaxRequestsPerChild  100
    #   server operation parameters
    KeepAlive            on
    MaxKeepAliveRequests 100
    KeepAliveTimeout     15
    Timeout              400
    IdentityCheck        off
    HostnameLookups      off
    #   paths to runtime files
    PidFile              /path/to/apache-rproxy.pid
    LockFile             /path/to/apache-rproxy.lock
    ErrorLog             /path/to/apache-rproxy.elog
    CustomLog            /path/to/apache-rproxy.dlog "%{%v/%T}t %h -> %{SERVER}e URL: %U"
    #   unused paths
    ServerRoot           /tmp
    DocumentRoot         /tmp
    CacheRoot            /tmp
    RewriteLog           /dev/null
    TransferLog          /dev/null
    TypesConfig          /dev/null
    AccessConfig         /dev/null
    ResourceConfig       /dev/null
    #   speed up and secure processing
    <Directory />
    Options -FollowSymLinks -SymLinksIfOwnerMatch
    AllowOverride None
    </Directory>
    #   the status page for monitoring the reverse proxy
    <Location /apache-rproxy-status>
    SetHandler server-status
    </Location>
    #   enable the URL rewriting engine
    RewriteEngine        on
    RewriteLogLevel      0
    #   define a rewriting map with value-lists where
    #   mod_rewrite randomly chooses a particular value
    RewriteMap  server  rnd:/path/to/apache-rproxy.conf-servers
    #   make sure the status page is handled locally
    #   and make sure no one uses our proxy except ourself
    RewriteRule ^/apache-rproxy-status.*  -  [L]
    RewriteRule ^(http|ftp)://.*          -  [F]
    #   now choose the possible servers for particular URL types
    RewriteRule ^/(.*\.(cgi|shtml))$  to://${server:dynamic}/$1  [S=1]
    RewriteRule ^/(.*)$               to://${server:static}/$1
    #   and delegate the generated URL by passing it
    #   through the proxy module
    RewriteRule ^to://([^/]+)/(.*)    http://$1/$2  [E=SERVER:$1,P,L]
    #   and make really sure all other stuff is forbidden
    #   when it should survive the above rules...
    RewriteRule .*  -  [F]
    #   enable the Proxy module without caching
    ProxyRequests        on
    NoCache              *
    #   setup URL reverse mapping for redirect responses
    ProxyPassReverse  /  http://www1.foo.dom/
    :

From Static to Dynamic

Description:
How can we turn the static page foo.html into a dynamic variant foo.cgi seamlessly, without the browser or user noticing?

Solution:
We just rewrite the URL to the CGI script and force the correct MIME type so it really runs as a CGI script. This way a request to /~quux/foo.html internally leads to the invocation of /~quux/foo.cgi:

    RewriteEngine on
    RewriteBase   /~quux/
    RewriteRule   ^foo\.html$  foo.cgi  [T=application/x-httpd-cgi]

On-the-fly Content-Regeneration

Description:
Here comes a really esoteric feature: dynamically generated but statically served pages. Pages should be delivered as purely static pages (read from the filesystem and just passed through), but they have to be generated dynamically by the webserver if they are missing. This way you can have CGI-generated pages which are statically served unless someone (or a cronjob) removes the static contents; then the contents get refreshed.

Solution:
This is done via the following ruleset:

    RewriteCond %{REQUEST_FILENAME}  !-s
    RewriteRule ^page\.html$  page.cgi  [T=application/x-httpd-cgi,L]

Here a request to page.html leads to an internal run of the corresponding page.cgi if page.html is still missing or has a filesize of zero.
The trick here is that page.cgi is an ordinary CGI script which, in addition to its STDOUT, writes its output to the file page.html. Once it has run, the server delivers the data of page.html. When the webmaster wants to force a refresh of the contents, he just removes page.html (usually done by a cronjob).

Document With Autorefresh

Description:
Wouldn't it be nice, while creating a complex webpage, if the webbrowser automatically refreshed the page every time we wrote a new version from within our editor? Impossible?

Solution:
No! We just combine the MIME multipart feature, the webserver NPH feature, and the URL manipulation power of mod_rewrite. First, we establish a new URL feature: adding just :refresh to any URL causes it to be refreshed every time it gets updated on the filesystem.

    RewriteRule ^(/[uge]/[^/]+/?.*):refresh  /internal/cgi/apache/nph-refresh?f=$1

Now when we reference the URL

    /u/foo/bar/page.html:refresh

this leads to the internal invocation of the URL

    /internal/cgi/apache/nph-refresh?f=/u/foo/bar/page.html

The only missing part is the NPH-CGI script. Although one would usually say "left as an exercise to the reader" ;-) I will provide this, too:

    #!/sw/bin/perl
    ##
    ##  nph-refresh -- NPH/CGI script for auto refreshing pages
    ##  Copyright (c) 1997 Ralf S. Engelschall, All Rights Reserved.
    ##
    $| = 1;

    #   split the QUERY_STRING variable
    @pairs = split(/&/, $ENV{'QUERY_STRING'});
    foreach $pair (@pairs) {
        ($name, $value) = split(/=/, $pair);
        $name =~ tr/A-Z/a-z/;
        $name = 'QS_' . $name;
        $value =~ s/%([a-fA-F0-9][a-fA-F0-9])/pack("C", hex($1))/eg;
        eval "\$$name = \"$value\"";
    }
    $QS_s = 1 if ($QS_s eq '');
    $QS_n = 3600 if ($QS_n eq '');
    if ($QS_f eq '') {
        print "HTTP/1.0 200 OK\n";
        print "Content-type: text/html\n\n";
        print "<b>ERROR</b>: No file given\n";
        exit(0);
    }
    if (! -f $QS_f) {
        print "HTTP/1.0 200 OK\n";
        print "Content-type: text/html\n\n";
        print "<b>ERROR</b>: File $QS_f not found\n";
        exit(0);
    }

    sub print_http_headers_multipart_begin {
        print "HTTP/1.0 200 OK\n";
        $bound = "ThisRandomString12345";
        print "Content-type: multipart/x-mixed-replace;boundary=$bound\n";
        &print_http_headers_multipart_next;
    }

    sub print_http_headers_multipart_next {
        print "\n--$bound\n";
    }

    sub print_http_headers_multipart_end {
        print "\n--$bound--\n";
    }

    sub displayhtml {
        local($buffer) = @_;
        $len = length($buffer);
        print "Content-type: text/html\n";
        print "Content-length: $len\n\n";
        print $buffer;
    }

    sub readfile {
        local($file) = @_;
        local(*FP, $size, $buffer, $bytes);
        ($x, $x, $x, $x, $x, $x, $x, $size) = stat($file);
        $size = sprintf("%d", $size);
        open(FP, "<$file");
        $bytes = sysread(FP, $buffer, $size);
        close(FP);
        return $buffer;
    }

    $buffer = &readfile($QS_f);
    &print_http_headers_multipart_begin;
    &displayhtml($buffer);

    sub mystat {
        local($file) = $_[0];
        local($time);
        ($x, $x, $x, $x, $x, $x, $x, $x, $x, $mtime) = stat($file);
        return $mtime;
    }

    $mtimeL = &mystat($QS_f);
    $mtime = $mtimeL;
    for ($n = 0; $n < $QS_n; $n++) {
        while (1) {
            $mtime = &mystat($QS_f);
            if ($mtime ne $mtimeL) {
                $mtimeL = $mtime;
                sleep(2);
                $buffer = &readfile($QS_f);
                &print_http_headers_multipart_next;
                &displayhtml($buffer);
                sleep(5);
                $mtimeL = &mystat($QS_f);
                last;
            }
            sleep($QS_s);
        }
    }

    &print_http_headers_multipart_end;
    exit(0);

    ##EOF##

Mass Virtual Hosting

Description:
The <VirtualHost> feature of Apache is nice and works great when you just have a few dozen virtual hosts.
But when you are an ISP and have hundreds of virtual hosts to provide, it is not the best choice.

Solution:
To provide this feature we map each virtual hostname onto its DocumentRoot with a pair of rewrite maps, driven by an external map file:

    ##
    ##  vhost.map
    ##
    www.vhost1.dom:80  /path/to/docroot/vhost1
    www.vhost2.dom:80  /path/to/docroot/vhost2
    :
    www.vhostN.dom:80  /path/to/docroot/vhostN

    ##
    ##  httpd.conf
    ##
    :
    #   use the canonical hostname on redirects, etc.
    UseCanonicalName on
    :
    #   add the virtual host in front of the CLF-format
    CustomLog /path/to/access_log "%{VHOST}e %h %l %u %t \"%r\" %>s %b"
    :
    #   enable the rewriting engine in the main server
    RewriteEngine on
    #   define two maps: one for fixing the URL and one which defines
    #   the available virtual hosts with their corresponding
    #   DocumentRoot.
    RewriteMap lowercase  int:tolower
    RewriteMap vhost      txt:/path/to/vhost.map
    #   Now do the actual virtual host mapping
    #   via a huge and complicated single rule:
    #
    #   1. make sure we don't map for common locations
    RewriteCond %{REQUEST_URI}  !^/commonurl1/.*
    RewriteCond %{REQUEST_URI}  !^/commonurl2/.*
    :
    RewriteCond %{REQUEST_URI}  !^/commonurlN/.*
    #
    #   2. make sure we have a Host header, because
    #      currently our approach only supports
    #      virtual hosting through this header
    RewriteCond %{HTTP_HOST}  !^$
    #
    #   3. lowercase the hostname
    RewriteCond ${lowercase:%{HTTP_HOST}|NONE}  ^(.+)$
    #
    #   4. lookup this hostname in vhost.map and
    #      remember it only when it is a path
    #      (and not "NONE" from above)
    RewriteCond ${vhost:%1}  ^(/.*)$
    #
    #   5. finally we can map the URL to its docroot location
    #      and remember the virtual host for logging purposes
    RewriteRule ^/(.*)$  %1/$1  [E=VHOST:${lowercase:%{HTTP_HOST}}]

Access Restriction

Blocking of Robots

Description:
How can we block a really annoying robot from retrieving pages of a specific webarea?
A /robots.txt file containing entries of the "Robot Exclusion Protocol" is typically not enough to get rid of such a robot.

Solution:
We use a ruleset which forbids the URLs of the webarea /~quux/foo/arc/ (perhaps a very deep, directory-indexed area where robot traversal would create heavy server load). We have to make sure that we forbid access only to the particular robot: just forbidding the host where the robot runs is not enough, because that would block users from this host, too. We accomplish this by also matching the User-Agent HTTP header:

    RewriteCond %{HTTP_USER_AGENT}  ^NameOfBadRobot.*
    RewriteCond %{REMOTE_ADDR}      ^123\.45\.67\.[8-9]$
    RewriteRule ^/~quux/foo/arc/.+  -  [F]

Blocked Inline-Images

Description:
Assume we have some pages with inlined GIF graphics under http://www.quux-corp.de/~quux/. These graphics are nice, so others directly incorporate them via hyperlinks into their own pages. We don't like this practice because it adds useless traffic to our server.

Solution:
While we cannot protect the images completely, we can at least restrict the cases where the browser sends an HTTP Referer header:

    RewriteCond %{HTTP_REFERER}        !^$
    RewriteCond %{HTTP_REFERER}        !.*/foo-with-gif\.html$
    RewriteRule ^inlined-in-foo\.gif$  -  [F]

Host Deny

Description:
How can we forbid a list of externally configured hosts from using our server?

Solution:
For Apache >= 1.3b6:

    RewriteEngine on
    RewriteMap  hosts-deny  txt:/path/to/hosts.deny
    RewriteCond ${hosts-deny:%{REMOTE_HOST}|NOT-FOUND}  !=NOT-FOUND [OR]
    RewriteCond ${hosts-deny:%{REMOTE_ADDR}|NOT-FOUND}  !=NOT-FOUND
    RewriteRule ^/.*  -  [F]

For Apache <= 1.3b6:

    RewriteEngine on
    RewriteMap  hosts-deny  txt:/path/to/hosts.deny
    RewriteRule ^/(.*)$  ${hosts-deny:%{REMOTE_HOST}|NOT-FOUND}/$1
    RewriteRule !^NOT-FOUND/.*  -  [F]
    RewriteRule ^NOT-FOUND/(.*)$  ${hosts-deny:%{REMOTE_ADDR}|NOT-FOUND}/$1
    RewriteRule !^NOT-FOUND/.*  -  [F]
    RewriteRule ^NOT-FOUND/(.*)$  /$1

    ##
    ##  hosts.deny
    ##
    ##  ATTENTION! This is a map, not a list, even when we treat it as such.
    ##  mod_rewrite parses it for key/value pairs, so at least a
    ##  dummy value "-" must be present for each entry.
    ##
    193.102.180.41  -
    bsdti1.sdm.de   -
    192.76.162.40   -

URL-Restricted Proxy

Description:
How can we restrict the proxy to allow access to a configurable set of internet sites only? The site list is extracted from a prepared bookmarks file.

Solution:
We first have to make sure mod_rewrite is below(!) mod_proxy in the Configuration file when compiling the Apache webserver (or in the AddModule list of httpd.conf), as it must get called _before_ mod_proxy.

For simplicity, we generate the site list as a textfile map (but see the mod_rewrite documentation for a conversion script to DBM format). A typical Netscape bookmarks file can be converted to a list of sites with a shell script like this:

    #!/bin/sh
    cat ${1:-~/.netscape/bookmarks.html} |
    tr -d '\015' |
    tr '[A-Z]' '[a-z]' |
    grep href=\" |
    sed -e '/file:/d;' -e '/news:/d;' \
        -e 's|^.*"[^:]*://\([^:/"]*\).*$|\1 OK|;' \
        -e 's|^.*"\([^:/"]*\).*$|\1 OK|;' |
    sort -u

We redirect the resulting output into a text file called goodsites.txt. It now contains one hostname per line, each tagged "OK", similar to this:

    www.apache.org     OK
    xml.apache.org     OK
    jakarta.apache.org OK
    perl.apache.org    OK
    ...

We reference this site file within the configuration for the VirtualHost which is responsible for serving as a proxy (often not port 80, but 81, 8080 or 8008):

    <VirtualHost *:8008>
    ...
    RewriteEngine On
    #   Either use the (plaintext) allow list from goodsites.txt
    RewriteMap  ProxyAllow  txt:/usr/local/apache/conf/goodsites.txt
    #   Or, for faster access, convert it to a DBM database:
    #RewriteMap ProxyAllow  dbm:/usr/local/apache/conf/goodsites
    #   Match lowercased hostnames
    RewriteMap  lowercase   int:tolower
    #   Here we go:
    #   1) first lowercase the site name and strip off a :port suffix
    RewriteCond ${lowercase:%{HTTP_HOST}}  ^([^:]*).*$
    #   2) next look it up in the map file.
    #      "%1" refers to the previous regex.
    #      If the result is "OK", proxy access is granted.
    RewriteCond ${ProxyAllow:%1|DENY}  !^OK$  [NC]
    #   3) Disallow proxy requests if the site was _not_ tagged "OK":
    RewriteRule ^proxy:  -  [F]
    ...
    </VirtualHost>

Proxy Deny

Description:
How can we forbid a certain host, or even a user of a special host, from using the Apache proxy?

Solution:
We first have to make sure mod_rewrite is below(!) mod_proxy in the Configuration file when compiling the Apache webserver. This way it gets called _before_ mod_proxy.
Then we configure the following for a host-dependent deny...

    RewriteCond %{REMOTE_HOST}  ^badhost\.mydomain\.com$
    RewriteRule !^http://[^/.]\.mydomain\.com.*  -  [F]

...and this one for a user@host-dependent deny:

    RewriteCond %{REMOTE_IDENT}@%{REMOTE_HOST}  ^badguy@badhost\.mydomain\.com$
    RewriteRule !^http://[^/.]\.mydomain\.com.*  -  [F]

Special Authentication Variant

Description:
Sometimes a very special kind of authentication is needed, for instance one which checks for a set of explicitly configured users. Only these should receive access, and without the explicit prompting that would occur when using Basic Auth via mod_access.

Solution:
We use a list of rewrite conditions to exclude all except our friends:

    RewriteCond %{REMOTE_IDENT}@%{REMOTE_HOST}  !^friend1@client1\.quux-corp\.com$
    RewriteCond %{REMOTE_IDENT}@%{REMOTE_HOST}  !^friend2@client2\.quux-corp\.com$
    RewriteCond %{REMOTE_IDENT}@%{REMOTE_HOST}  !^friend3@client3\.quux-corp\.com$
    RewriteRule ^/~quux/only-for-friends/  -  [F]

Referer-based Deflector

Description:
How can we program a flexible URL deflector which acts on the "Referer" HTTP header and can be configured with as many referring pages as we like?

Solution:
Use the following really tricky ruleset...

    RewriteMap  deflector  txt:/path/to/deflector.map
    RewriteCond %{HTTP_REFERER}  !=""
    RewriteCond ${deflector:%{HTTP_REFERER}}  ^-$
    RewriteRule ^.*  %{HTTP_REFERER}  [R,L]
    RewriteCond %{HTTP_REFERER}  !=""
    RewriteCond ${deflector:%{HTTP_REFERER}|NOT-FOUND}  !=NOT-FOUND
    RewriteRule ^.*  ${deflector:%{HTTP_REFERER}}  [R,L]

...in conjunction with a corresponding rewrite map:

    ##
    ##  deflector.map
    ##
    http://www.badguys.com/bad/index.html   -
    http://www.badguys.com/bad/index2.html  -
    http://www.badguys.com/bad/index3.html  http://somewhere.com/

This automatically redirects the request back to the referring page (when "-" is used as the value in the map) or to a specific URL (when a URL is specified in the map as the second argument).

Other

External Rewriting Engine

Description:
A FAQ: how can we solve the FOO/BAR/QUUX/etc. problem? There seems to be no solution using mod_rewrite...

Solution:
Use an external rewrite map, i.e. a program which acts like a rewrite map.
It is run once, at Apache startup; it then receives the requested URLs on STDIN and has to write the resulting (usually rewritten) URLs to STDOUT, in the same order.

    RewriteEngine on
    RewriteMap    quux-map  prg:/path/to/map.quux.pl
    RewriteRule   ^/~quux/(.*)$  /~quux/${quux-map:$1}

    #!/path/to/perl
    #   disable buffered I/O which would lead
    #   to deadloops for the Apache server
    $| = 1;
    #   read URLs one per line from stdin and
    #   generate substitution URL on stdout
    while (<>) {
        s|^foo/|bar/|;
        print $_;
    }

This is a demonstration-only example which just rewrites all URLs /~quux/foo/... to /~quux/bar/.... Actually you can program whatever you like. But notice that while such maps can be used by an average user, only the system administrator can define them.
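The same map program can be sketched in Python (the guide's script is Perl; the prefix names here are the same illustrative foo/ and bar/):

```python
import sys

def rewrite(url):
    """Rewrite one requested URL the way the demo map does:
    replace a leading 'foo/' with 'bar/'; pass all others through."""
    if url.startswith("foo/"):
        return "bar/" + url[len("foo/"):]
    return url

if __name__ == "__main__":
    # mod_rewrite feeds one URL per line on stdin and expects the
    # rewritten URL on stdout, in the same order and unbuffered,
    # hence flush=True (the Perl equivalent of $| = 1).
    for line in sys.stdin:
        print(rewrite(line.rstrip("\n")), flush=True)
```

The crucial detail in either language is the unbuffered output: a buffered map program would deadlock the server, which waits for each answer line before sending the next request.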