
[Resolved] [Plugin: RePress] Proxied URLs seem to be not working

  • Hello,

    I use this plugin on my website.
    It proxies thepiratebay.org pretty well, but all the links on it seem to be linking to my own domain!
    So thepiratebay.org/browse goes to mydomain.com/browse instead of
    mydomain.com/repress/thepiratebay.org/browse

    What is happening here, and how can I fix this?
    Maybe some URL rewrite problem?

    http://wordpress.org/extend/plugins/repress/
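    The behaviour the poster expects is a simple prefixing scheme. Here is a minimal sketch, illustrative only and not RePress’s actual code (the helper name repress_url is made up):

```php
<?php
// Illustrative only (not RePress code): the URL mapping the poster expects.
// Every link on a proxied page should stay under the /repress/ prefix.
function repress_url($proxy_base, $target_host, $path) {
    return rtrim($proxy_base, '/') . '/' . $target_host . $path;
}

echo repress_url('http://mydomain.com/repress/', 'thepiratebay.org', '/browse');
// http://mydomain.com/repress/thepiratebay.org/browse
```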

  • Plugin Author Greenhost

    @greenhost

    Thanks for your feedback. I don’t have an answer right away, but we will try to resolve this issue in the next update.

    I’ve run into the same problem (and fixed it).
    Here’s my modified proxy.php file: http://pastebin.com/p8Erciv9
    Hope this helps!)
    mh

    Hello Mhume,

    Thanks for fixing the script, but I can’t seem to use it.

    When I replace the old proxy.php with this code, I get a white page when I try to use a link on RePress. So for me it doesn’t work, because I don’t see the proxied webpage 🙁

    Any ideas?

    I’ve done the same thing as freeouri and found the same problem: a white page after trying your fixed version of proxy.php.

    Can someone please help me/us with this???

    @greenhost, where is our update??

    Plugin Author emile_all4xs

    @emile_all4xs

    Hello, could those people who are experiencing this problem please post the URL of their proxy to this forum, or mail it to repress@all4xs.net

    Make sure you have the most recent version of the plugin, and also try to test with both URL obfuscation ‘on’ and ‘off’.

    Also, any further relevant details about your hosting environment (mod_php vs. FastCGI, shared vs. dedicated) might be helpful.

    Thank you and hope to be able to fix this!

    Hi guys
    My previous post was for an earlier version of the plugin.
    I’ve moved my edits for proxy.php to the latest version of the plugin (0.1alpha14) and put it up on pastebin.
    It can be found here http://pastebin.com/hHP8TGj1

    emile_all4xs
    I’m just about to drop a mail to repress@all4xs.net referencing this thread, with as many details as I can include. If you search the post on pastebin referenced above for mh_edit, you’ll see where I have amended the code.
    This all seems to come from a “bug” I get on my server where the response headers are missing until you read some data from the connection: https://bugs.php.net/bug.php?id=46896

    Hope this helps!
    mh

    PS: I would post all the requested data here, but that would be a little insecure!)

    Plugin Author emile_all4xs

    @emile_all4xs

    Thank you. I have just committed this patch to 0.1alpha15

    Plugin Author emile_all4xs

    @emile_all4xs

    Actually, I meant your first patch. I am looking at this next one right now.

    emile_all4xs
    You’re welcome!) Sharing is caring

    For this problem, consider the following PHP:

    error_reporting(E_ALL);
    $streamOptions = array('http'=>array('method'=>"GET",'host'=>'google.com'));
    $context = stream_context_create($streamOptions);
    $handle = fopen('http://google.com', "rb", false, $context);
    $meta = stream_get_meta_data($handle);
    echo '<pre><strong>$meta BEFORE read of one byte</strong><br>';
    print_r($meta);
    $one_byte = fread($handle, 1); // mh_edit: this read works around the “bug” reported at https://bugs.php.net/bug.php?id=46896, where the response headers are missing
    $meta = stream_get_meta_data($handle);
    echo '<br><strong>$meta AFTER read of one byte</strong><br>';
    print_r($meta);
    die;

    The output I get follows (some header content redacted from the second dump of $meta):

    $meta BEFORE read of one byte
    Array
    (
        [wrapper_data] => Array
            (
                [headers] => Array
                    (
                    )
    
                [readbuf] => Resource id #3
            )
    
        [wrapper_type] => cURL
        [stream_type] => cURL
        [mode] => rb
        [unread_bytes] => 0
        [seekable] =>
        [uri] => http://google.com
        [timed_out] =>
        [blocked] => 1
        [eof] =>
    )
    
    $meta AFTER read of one byte
    Array
    (
        [wrapper_data] => Array
            (
                [headers] => Array
                    (
                        [0] => HTTP/1.1 301 Moved Permanently
                        [1] => Location: http://www.google.com/
                        [2] => Content-Type: text/html; charset=UTF-8
                        [3] => Date: Thu, 10 May 2012 14:53:37 GMT
                        [4] => Expires: Sat, 09 Jun 2012 14:53:37 GMT
                        [5] => Cache-Control: public, max-age=2592000
                        [6] => Server: gws
                        [7] => Content-Length: 219
                        [8] => X-XSS-Protection: 1; mode=block
                        [9] => X-Frame-Options: SAMEORIGIN
                        [10] => HTTP/1.1 302 Found
                        [11] => Location: http://www.google.co.uk/
                        [12] => Cache-Control: private
                        [13] => Content-Type: text/html; charset=UTF-8
                        ... (data redacted here)
                        [33] => Transfer-Encoding: chunked
                    )
    
                [readbuf] => Resource id #3
            )
    
        [wrapper_type] => cURL
        [stream_type] => cURL
        [mode] => rb
        [unread_bytes] => 4095
        [seekable] =>
        [uri] => http://google.com
        [timed_out] =>
        [blocked] => 1
        [eof] =>
    )

    As you can see, the header data is in an array [headers], nested one level deeper than your code expects, and without the one-byte read there is no header data at all. I can only assume it’s something to do with the PHP version or the operating system.
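    To make that nesting difference concrete, here is a small sketch (mine, not from the thread; the helper name normalize_wrapper_headers is made up) that returns the raw header lines regardless of which shape wrapper_data takes:

```php
<?php
// Hypothetical helper (not RePress code): return the header lines whether
// PHP uses the plain http wrapper (flat array) or the cURL wrappers
// (nested one level deeper under a 'headers' key).
function normalize_wrapper_headers(array $meta) {
    $data = isset($meta['wrapper_data']) ? $meta['wrapper_data'] : array();
    if (isset($data['headers']) && is_array($data['headers'])) {
        return $data['headers']; // cURL-wrapper shape
    }
    return $data; // plain http wrapper: flat array of header lines
}

// The two shapes reported in this thread:
$plain = array('wrapper_data' => array('HTTP/1.0 200 OK', 'Server: gws'));
$curl  = array('wrapper_data' => array('headers' => array('HTTP/1.1 200 OK')));

print_r(normalize_wrapper_headers($plain));
print_r(normalize_wrapper_headers($curl));
```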

    emile_all4xs, could you post your results from running the above code?

    Could another user who sees this problem implement my fix and leave a response here to let us know if it works for others?

    Hope this helps
    mh

    Hello emile_all4xs,

    My RePress URL is: http://www.carloscapote.com/blog/repress/

    After trying to open only one URL, http://piratebay.cc, my log (the one generated by the logline function in proxy.php) was:

    [Log data moderated as per the Forum Rules. The maximum number of lines of code that you can post in these forums is ten lines. Please use the pastebin]

    On my server, the result of the “test” proposed by mhume was the same (no headers until one byte was read):

    [Log data moderated as per the Forum Rules. The maximum number of lines of code that you can post in these forums is ten lines. Please use the pastebin]

    As additional info, my site is on shared hosting and my PHP version is 5.3.8.

    Thanks a lot

    And here’s a link to pastebin for the modified proxy.php file for 0.1alpha15: http://pastebin.com/Pq34R3J6
    mh

    It works! It’s working in my blog, with @mhume‘s patch. Thanks!

    I’m running WordPress 3.3.2.

    P.S.: I’ve been testing and I found some minor issues using obfuscated URLs, but they’re probably not related to the patch. I will try to test it a little bit more and send you some feedback.

    Hi
    On testing some more of the servers under my control, I’ve managed to see both types of results, i.e. those that need a read of at least one byte of data before the header info appears, and those that don’t.

    There are many differences between the environments on these servers, but the one that seems to introduce this issue is PHP being compiled with --with-curlwrappers

    Consider the code
    phpinfo(INFO_GENERAL);

    Search the resulting page for “curl”: if you find --with-curlwrappers listed in the Configure Command section of phpinfo(), then you’ll also see the problem this thread is about.
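    That manual search can also be done in code. A small sketch of the same check (mine, not from the thread), capturing phpinfo() output and grepping it for the configure flag:

```php
<?php
// Capture phpinfo() output and look for the curl-wrappers configure flag.
// ob_start() buffers the output so we can search it as a string.
ob_start();
phpinfo(INFO_GENERAL);
$info = ob_get_clean();

$built_with_curlwrappers = (strpos($info, '--with-curlwrappers') !== false);
var_dump($built_with_curlwrappers);
```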

    Consider the following code

    $streamOptions = array('http'=>array('method'=>"GET",'host'=>'google.com'));
    $context = stream_context_create($streamOptions);
    $handle = fopen('http://google.com', "rb", false, $context);
    $meta = stream_get_meta_data($handle);
    echo '<pre><strong>stream_get_meta_data</strong><br>';
    print_r($meta);
    die;

    On a server WITHOUT PHP compiled with --with-curlwrappers (i.e. what the developers of the plugin expect):

    stream_get_meta_data
    Array
    (
        [wrapper_data] => Array
            (
                [0] => HTTP/1.0 301 Moved Permanently
                ... (data redacted here)
                [32] => X-Frame-Options: SAMEORIGIN
            )
    
        [wrapper_type] => http
        [stream_type] => tcp_socket/ssl
        [mode] => r+
        [unread_bytes] => 684
        [seekable] =>
        [uri] => http://google.com
        [timed_out] =>
        [blocked] => 1
        [eof] =>
    )

    On a server WITH PHP compiled with --with-curlwrappers:

    stream_get_meta_data
    Array
    (
        [wrapper_data] => Array
            (
                [headers] => Array
                    (
                    )
    
                [readbuf] => Resource id #3
            )
    
        [wrapper_type] => cURL
        [stream_type] => cURL
        [mode] => rb
        [unread_bytes] => 0
        [seekable] =>
        [uri] => http://google.com
        [timed_out] =>
        [blocked] => 1
        [eof] =>
    )

    Note the differences in wrapper_type and stream_type.
    The wrapper_type is the important one for determining the format of the returned wrapper_data.
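    That observation can be turned into a runtime check; a sketch of the idea (mine, not plugin code; the function name uses_curl_wrapper is made up), using the two meta-data shapes shown above:

```php
<?php
// Hypothetical check (not RePress code): decide from the stream meta data
// whether PHP is routing this stream through the cURL wrappers.
function uses_curl_wrapper(array $meta) {
    return isset($meta['wrapper_type']) && $meta['wrapper_type'] === 'cURL';
}

// The two shapes reported earlier in this thread:
$without = array('wrapper_type' => 'http', 'stream_type' => 'tcp_socket/ssl');
$with    = array('wrapper_type' => 'cURL', 'stream_type' => 'cURL');

var_dump(uses_curl_wrapper($without)); // bool(false)
var_dump(uses_curl_wrapper($with));    // bool(true)
```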

    Ok devs,
    I’ve given you a fix (admittedly not the prettiest code), and now I’ve given you the info on why it happens.
    If I tidy my code and optimise it for both configurations, do you think we can get it added to the source?
    mh

    Ok guys and gals (and the devs)

    Here’s a modified proxy.php file for 0.1alpha15
    http://pastebin.com/MVBjNMLj

    This version checks which wrapper requests are being sent with and performs any necessary code changes (keeping the overhead down for those without the cURL wrapper).
    If a change of wrapper is detected, it’s stored, and the user is notified of the change and prompted to refresh their browser.
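    In outline, the wrapper-aware workaround amounts to something like this sketch (the idea only, not the pastebin code; the helper name read_headers_and_body is illustrative):

```php
<?php
// Outline of the idea only (not the actual pastebin patch): when the cURL
// wrappers are active, read one byte so PHP populates the response headers
// (https://bugs.php.net/bug.php?id=46896), then keep that byte with the body.
function read_headers_and_body($handle) {
    $prefix = '';
    $meta = stream_get_meta_data($handle);
    if (isset($meta['wrapper_type']) && $meta['wrapper_type'] === 'cURL') {
        $prefix = fread($handle, 1);            // forces header parsing
        $meta = stream_get_meta_data($handle);  // headers now present
        $headers = $meta['wrapper_data']['headers'];
    } elseif (isset($meta['wrapper_data'])) {
        $headers = $meta['wrapper_data'];       // plain http wrapper
    } else {
        $headers = array();                     // local/other streams
    }
    $body = $prefix . stream_get_contents($handle);
    return array($headers, $body);
}
```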
    mh 😉

  • The topic ‘[Resolved] [Plugin: RePress] Proxied URLs seem to be not working’ is closed to new replies.