APS crawler: String could not be parsed as XML

Discussion in 'Installation/Configuration' started by VANKO, Mar 19, 2015.

  1. VANKO

    VANKO New Member

    OS: Ubuntu 14.04
    ISP: 3.0.4.3p5

    Hi, I'm trying to update the package list but I can't because of the error: String could not be parsed as XML.
    How can I solve it?
     
  2. florian030

    florian030 Well-Known Member HowtoForge Supporter

    Update interface/lib/classes/aps_crawler.inc.php link
    and server/lib/classes/aps_installer.inc.php link
     
  3. VANKO

    VANKO New Member

    Thanks for the reply, but it's still not working and I don't know why.
     
  4. RHITNL

    RHITNL Member

    Hi,

    I have the exact same issue, running a CentOS 6 nginx setup (installed via the guide without any problem).
    The files are downloaded to the folder /usr/local/ispconfig/interface/web/sites/aps_meta_packages.
    They are zipped but contain nothing more than PKG_URL.
    I used the patch but it still stops the script before adding the values to the database.
    Database: dbispconfig - Table: aps_packages is still completely empty.
    The nginx log shows the following, which I think is related.
    Code:
    2015/03/21 23:54:21 [error] 9651#0: *95 FastCGI sent in stderr: "PHP message: PHP Warning:  SimpleXMLElement::__construct(): Entity: line 1: parser error : StartTag: invalid element name in /usr/local/ispconfig/interface/lib/classes/aps_crawler.inc.php on line 288
    PHP message: PHP Warning:  SimpleXMLElement::__construct(): < in /usr/local/ispconfig/interface/lib/classes/aps_crawler.inc.php on line 288
    PHP message: PHP Warning:  SimpleXMLElement::__construct():  ^ in /usr/local/ispconfig/interface/lib/classes/aps_crawler.inc.php on line 288" while reading response header from upstream, client: XXX.XXX.XXX.XXX, server: _, request: "GET /sites/aps_cron_apscrawler_if.php HTTP/1.1", upstream: "fastcgi://unix:/var/lib/php5-fpm/ispconfig.sock:", host: "domain.tld:8080", referrer: "https://domain.tld:8080/index.php"
    The system log set to debug show
    Code:
    [INTERFACE]: APS crawler: Cannot read metadata from LePaySys-1.0-1.app.zip
    for each downloaded file, in my case a total of 83.
    Been spending several hours on this but I'm unable to solve it.

    Thanks for the great panel. Hopefully you can point me in the right direction to make it even more useful.

    Regards,

    Ralph
     
  5. veshant

    veshant New Member

    I am having the same issue as RHITNL. Been trying to solve it since yesterday.
     
  6. till

    till Super Moderator Staff Member ISPConfig Developer

    The problem is that your server is not able to download the APS packages from apsstandard.org over HTTPS. This can have many causes, e.g. a firewall in front of your server that blocks the download. Another possibility is that your server is not able to validate the SSL certificate and therefore denies the download; please install the two patched files that Florian pointed out, as they disable the SSL verification for the download. This problem appears only on Linux distributions that use very old software versions, CentOS 6 specifically, as they can't work with the stronger SSL settings that apsstandard.org uses now.
     
  7. veshant

    veshant New Member

    It's still logging the same error after applying Florian's patch. I'm running on Ubuntu 14.04, so I think the SSL should be fine. Could there be any other causes?
     
  8. RHITNL

    RHITNL Member

    Thanks for the reply, but both patches have already been installed; I added the extra lines in both files. If the server cannot download, where do the (zip) files come from? I would think even that part would fail.
    Since I have SSL enabled as well with a self-signed certificate, I disabled SSL on the server side too, to be sure the communication to my server is working.
    This is a cloud VPS server and the only firewall is in the ISPConfig panel, which is disabled now. I would like to use CentOS 7, but the tutorial you've written does not work on a VPS like mine: the disk setup is different and therefore I cannot set quota as it should be. RHEL is in my opinion the best web server OS you can use, and I have never had any issues with any software so far.
    Any other thoughts? Something I can update or replace? My test server at home has the same setup, but I will give Debian a shot.
     
  9. till

    till Super Moderator Staff Member ISPConfig Developer

    The plugin uses a curl multi download. When a curl multi download fails for whatever reason, it creates an empty file.

    The guide works for all kinds of servers and VPS; I use it daily to install ISPConfig for clients all around the world. If your hoster supports any kind of Linux quota, then it will work in ISPConfig. If your hoster does not support quota at all, then ISPConfig will work as well; you just can't limit the website size without a filesystem quota.

    RHEL is nice as long as you don't need any additional software, as it lacks so many packages. As soon as you have to start adding several third-party repositories because Red Hat does not provide all required packages, you will find yourself with a quite unstable system. The benefit of Debian and Ubuntu is that they have a much larger package base in a central repository; this ensures that all packages are tested against each other and you don't get the package conflicts you see on CentOS or RHEL with their third-party repos. Another benefit of Debian is that you have been able to do live dist upgrades for years now; I have servers that were live-updated from Debian 3.1 to 7. Try to do a CentOS 4 to 7 upgrade while the server is running.
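    To illustrate the mechanism, the following is a minimal, self-contained sketch (not ISPConfig's actual code) of how a failed curl multi download still leaves a zero-byte file behind: the output file is opened before the transfer starts, so nothing is ever written to it when the transfer fails.

```php
<?php
// Minimal sketch (illustrative, not ISPConfig's code): the target file is
// created with fopen() before the transfer, so a failed download leaves
// an empty file behind -- exactly what shows up in aps_meta_packages.
$target = sys_get_temp_dir() . '/demo.app.zip';
$fh = fopen($target, 'w');               // file is created here, before any download

$ch = curl_init('http://127.0.0.1:1/');  // unreachable URL to force a failure
curl_setopt($ch, CURLOPT_FILE, $fh);     // write the response body to $fh
curl_setopt($ch, CURLOPT_TIMEOUT, 2);

$mh = curl_multi_init();
curl_multi_add_handle($mh, $ch);
do {
    curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh, 0.1);
    }
} while ($running > 0);
curl_multi_remove_handle($mh, $ch);
curl_multi_close($mh);
fclose($fh);

// The transfer failed, but the empty file still exists:
echo $target, ' size: ', filesize($target), " bytes\n";
```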
     
  10. HouseNationNL

    HouseNationNL New Member

    Having the same issues as RHITNL.

    The .zip files aren't really empty: each one is shown as a directory and contains a file with the name PKG_URL

    When I open the PKG_URL in for example Yourls-1.5-4.app.zip:

    Code:
    [root@hosting aps_meta_packages]# cd Yourls-1.5-4.app.zip/
    [root@hosting Yourls-1.5-4.app.zip]# ls
    PKG_URL
    [root@hosting Yourls-1.5-4.app.zip]# cat PKG_URL
    https://apscatalog.com/1/yourls.org/Yourls/1.5-4.aps?arch=undefined&packager=www.softec-internet.com&os=undefined&platform=undefined
     
  11. till

    till Super Moderator Staff Member ISPConfig Developer

    These are not zip files, they are folders, and the file that you see inside the folder gets added by ISPConfig independently of the download, so it has not been downloaded from somewhere. Like I pointed out, the issue is that the download by curl fails completely.

    It is also possible that apsstandard.org has technical issues or that they changed their SSL settings again so that the curl options have to be adjusted. I will check that tomorrow.
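    A quick way to confirm this on an affected server is to check whether the entries are directories rather than zip files. This is a hypothetical one-off check, assuming the path mentioned earlier in the thread:

```php
<?php
// One-off diagnostic (path taken from the posts above; adjust as needed).
// Entries that are directories never finished downloading -- a real
// downloaded package would be a regular zip file.
$dir = '/usr/local/ispconfig/interface/web/sites/aps_meta_packages';
foreach (glob($dir . '/*.app.zip') as $entry) {
    echo basename($entry), ': ',
         is_dir($entry) ? 'directory (download failed)' : 'regular file',
         PHP_EOL;
}
```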
     
    Last edited: Mar 22, 2015
  12. HouseNationNL

    HouseNationNL New Member

    Ok, but I really don't have a clue where to look now. Already applied the patches to turn off SSL, but without any results.

    Running:
    CentOS release 6.6 (Final)

    Tried a clean install on CentOS 7 but results in the same issue.
     
  13. till

    till Super Moderator Staff Member ISPConfig Developer

    I'll try to find the time to check that tomorrow.

    It is possible that apsstandard.org has technical issues or that they changed their SSL settings again so that the curl options have to be adjusted.
     
  14. HouseNationNL

    HouseNationNL New Member

    ok, thanks in advance!
     
  15. RHITNL

    RHITNL Member

    The URL HouseNationNL mentioned inside the PKG_URL file also works over HTTP instead of HTTPS.
    I know it is less secure, but in which file is it set that the APS crawler needs to use HTTPS?
     
  16. veshant

    veshant New Member

    When I curl from the command line using the HTTPS URL, I get an error: exit code 60, which relates to SSL. I tried using the -k parameter and it works fine. In that case the earlier patch should work, as it is equivalent to disabling CURLOPT_SSL_VERIFYPEER. But it doesn't? I also tried curl with the HTTP address, and it worked without specifying any parameters. I noticed that the aps_crawler.inc.php file uses the HTTP address, so I don't see why it doesn't work.
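    One way to see the raw curl error number from PHP itself, instead of inferring it from the log, is a standalone probe script. Everything below is illustrative and not part of the crawler; the URL is a placeholder:

```php
<?php
// Standalone probe (illustrative): surface the raw curl error for one URL.
// Error 60 (CURLE_SSL_CACERT) means the peer certificate could not be
// verified -- the same failure as "exit code 60" from the curl command line.
function probe($url, $verify = true) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    if (!$verify) {
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0); // same effect as curl -k
        curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
    }
    curl_exec($ch);
    $err = curl_errno($ch);   // 0 = success, 60 = SSL certificate problem
    $msg = curl_error($ch);
    curl_close($ch);
    return array($err, $msg);
}

list($err, $msg) = probe('https://apscatalog.com/'); // placeholder URL
echo "curl errno: $err ($msg)\n";
```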
     
  17. emcee

    emcee New Member

    Hello, I have the same issue with the APS installer: packages won't update, available packages 0. I'm using Debian Wheezy + ISPConfig (set up as here: https://www.howtoforge.com/how-to-r...and-secondary-with-ispconfig-3-debian-squeeze)

    Everything is working excellently, DNS servers and all. APS, however, is not. I have tried everything I found in this forum and elsewhere, but to no avail.
    The Apache error log shows this:
    ------------------------------------------
    [Mon Mar 23 15:00:54 2015] [warn] [client 84....] mod_fcgid: stderr: PHP Warning: SimpleXMLElement::__construct(): ^ in /usr/local/ispconfig/interface/lib/classes/aps_crawler.inc.php on line 289, referer: https://o......com:8080/index.php#
    -------------------------------------------------------------
    ...on line 289 is this: $sxe = new SimpleXMLElement($xml);

    I'm stuck. Please help.
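    The warning itself only means that whatever ended up in $xml is not well-formed XML -- typically an empty string or an HTML error page from the failed download. A generic defensive pattern (a sketch, not the actual ISPConfig fix) is to validate the string first, using simplexml_load_string() together with libxml_use_internal_errors(), which collects parse errors instead of emitting constructor warnings:

```php
<?php
// Generic sketch: validate the fetched body before parsing it, instead of
// letting new SimpleXMLElement($xml) spray warnings into the server log.
function parse_xml_or_null($xml) {
    if (trim((string)$xml) === '') {
        return null;                      // empty download, nothing to parse
    }
    libxml_use_internal_errors(true);     // collect errors instead of warnings
    $sxe = simplexml_load_string($xml);
    if ($sxe === false) {
        foreach (libxml_get_errors() as $e) {
            error_log(trim($e->message)); // log what was wrong with the body
        }
        libxml_clear_errors();
        return null;
    }
    return $sxe;
}

var_dump(parse_xml_or_null('<pkg><name>Yourls</name></pkg>') !== null); // bool(true)
var_dump(parse_xml_or_null('not xml at all'));                          // NULL
```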
     
  18. till

    till Super Moderator Staff Member ISPConfig Developer

    I had a similar issue today with a different script that uses PHP curl. I also had to disable CURLOPT_SSL_VERIFYHOST to get it working. Maybe you can try to add:

    curl_setopt($conn[$i], CURLOPT_SSL_VERIFYHOST, 0);

    right below the line:

    curl_setopt($conn[$i], CURLOPT_SSL_VERIFYPEER, 0);

    in both files.
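    In context, the patched section would look something like the following sketch. The loop and variable names are simplified stand-ins for the crawler's actual code; only the two curl_setopt() lines are the fix itself:

```php
<?php
// Simplified stand-in for the crawler's multi-download setup -- only the
// two SSL-related curl_setopt() calls below are the actual fix.
$urls = array('https://apsstandard.org/'); // placeholder URL list
$conn = array();
foreach ($urls as $i => $url) {
    $conn[$i] = curl_init($url);
    curl_setopt($conn[$i], CURLOPT_RETURNTRANSFER, true);
    curl_setopt($conn[$i], CURLOPT_SSL_VERIFYPEER, 0); // from Florian's patch
    curl_setopt($conn[$i], CURLOPT_SSL_VERIFYHOST, 0); // the additional fix
}
echo count($conn), " handle(s) prepared\n";
```

    Note that disabling both checks makes curl accept any certificate, so this is a workaround for outdated SSL stacks and CA bundles, not a hardening measure.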
     
  19. Steinbruch

    Steinbruch Member HowtoForge Supporter

    Hi,
    I came here because my ISPConfig setup hit the same issue today. I just tried the change suggested by Till, but it doesn't fix it; the same error as described above keeps showing up.
    This has me slightly stranded, as I was hoping to install an updated package that is now not even listed anymore...
    Thanks in advance for any pointers towards a fix.
    - Jerry
     
  20. till

    till Super Moderator Staff Member ISPConfig Developer

    Did you first install the new files that Florian pointed out and then add the additional fix, so that both files now have the lines:

    curl_setopt($conn[$i], CURLOPT_SSL_VERIFYHOST, 0);
    curl_setopt($conn[$i], CURLOPT_SSL_VERIFYPEER, 0);
     
