Forum Replies Created

    Bummer. But thank you for keeping on this! 🙂

    I’d like to soften my tone a bit, and offer some positivity. I do think the Element Pack widgets are useful. They do what they are supposed to do, and they look nice.

    I liked it enough to want to support your work by investing in the commercial version, and I was especially enthusiastic about your business model (a one-time purchase and you’re done). There are at least 5 other WP plugins that I would gladly pay for, but they all require subscriptions and I just won’t do that. I’m sure I’m not alone; I think a lot of WP developers are leaving tons of money on the table. Adobe is finding out the hard way that the subscription model is failing.

    Look, my only complaint about Element Pack is that it loads everywhere.

    [redacted — for the same reason]

    Elementor (the core) and Elementor Pro are, IMO, a necessary evil. Luckily, load times have massively improved recently. So no, I won’t be killing that.

    WooCommerce does not load everywhere, but you are right that the form plugin CF7 *does* load everywhere… extra annoying if you integrate Google reCAPTCHA, because that gets loaded everywhere too. To deal with that, I’m using Asset CleanUp to kill it everywhere except where needed.
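
    For anyone who would rather do that in code than with the Asset CleanUp UI, here is a rough sketch of the conditional dequeue approach in a child theme’s functions.php. The ‘contact-form-7’ and ‘google-recaptcha’ handle names and the ‘contact’ page slug are assumptions for illustration; check the handles in your own install and adjust the condition before relying on it:

    add_action( 'wp_enqueue_scripts', function () {
        if ( is_page( 'contact' ) ) { // the one page that actually needs the form
            return;
        }
        // Handle names are assumed; verify them in your own install.
        wp_dequeue_script( 'contact-form-7' );
        wp_dequeue_style( 'contact-form-7' );
        wp_dequeue_script( 'google-recaptcha' );
    }, 20 ); // priority 20 so this runs after CF7 enqueues its assets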

    I would do the same with Element Pack, except I’ve been using it so long I can’t remember where I need it and where I don’t. If I had realized it would bog down every page, just so I could use a widget or two on a few pages… no way I’d have ever started using it.

    Do I have other intentions? That’s funny.

    Hate to admit it, but I rarely take the time to review plugins. I either have to really love one or really hate it. I was shocked when I discovered how intertwined your plugin was, so I reached out for support (as a Pro user) looking for a way around it.

    Your response was super slow, but I was patient. When it finally did come, it was a random answer, probably intended for some other person. I replied to you, pointing out I had received someone else’s answer… but received nothing further.

    To get your attention, I posted my review.

    And what do ya know, it worked! I finally have an on-subject response from you:

    “Anyway, the good news is we are working on optimizing assets loading so you can select the loading behavior as you wish. stay with us. The next major release will be coming soon.”

    Had you said that in your response to my enquiry, I would have had no need to post a review.

    But I’m sure I would have followed up with these questions:
    1. Will Element Pack still automatically load everywhere? Even when not in use?
    2. Will we be able to choose loading in <body> instead of <head>?
    3. Will we be able to choose deferred loading instead of render-blocking?
    4. When-ish do you think the next release will be ready?

    [redacted]

    Changing the code worked for me also, thanks!

    For anybody else trying to do this, here is where the file lives:
    /wp-content/plugins/elementor/core/editor/editor.php

    DUMB QUESTION: When the Elementor plugin updates, this will be overwritten, correct?

    You’re welcome, @frankmarks.

    Personally, I think XML sitemaps are overrated. Bots will crawl your site, hitting every page they find a link to except those that are forbidden. The one genuinely useful thing XML sitemaps do is help bots find pages that are not linked from any other crawlable page on your site.

    There is a very simple workaround. Uninstall the XML sitemap plugin and just add a link to a plain HTML sitemap page in your footer (footer links are easy to discover and, since they appear on every page, tend to be treated as important). You can build that page yourself very easily, and control every link on it. Then put up your own robots.txt and call it a day.

    Good luck!

    Hi @vmarko,

    Ah, it’s comforting to hear that this is “normal”.

    I wonder about my settings, though. I’m on my own VPS, well provisioned and very fast. I decided to speed up my cache preloads.

    I have it set to:
    Automatically prime the page cache
    Update interval: 10 Seconds
    Pages per interval: 15

    Is that acceptable?


    Done! Thank you!

    Hi @vmarko,

    Thank you so much for the explanation. I will keep zlib disabled as advised.

    Thanks!

    No worries! And now I’m excited about your next release!

    FYI, the Google Sitemap XML plugin does NOT obey the physical robots.txt. You can still put up a physical robots.txt to prevent indexing, but you will also have to manually exclude the posts/pages you do not want indexed from the generated sitemap. Otherwise you will trigger Google Search Console errors.

    Ok, I think I have found the solution!!!

    I am putting this up to help anybody else who comes down this path, and for my own future reference (as I don’t currently have time to implement).

    The solution is quite simple: Override the virtual robots.txt by creating a physical one!

    Before you do that, check what is currently in your virtual file by browsing to it:
    http://www.yoursite.com/robots.txt

    In my case, the only thing there is:
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php
    Sitemap: (a url to my site)
    Sitemap: (a url to my site)
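
    (Side note on why you will not find this file on disk: WordPress generates the “virtual” robots.txt on the fly, and plugins append their lines through the robots_txt filter, roughly like the sketch below. This is illustrative only, not the actual plugin code; a physical robots.txt in the web root is served directly by the server, which is why creating one overrides the virtual version.)

    add_filter( 'robots_txt', function ( $output, $public ) {
        // Append a sitemap line to the virtual robots.txt output.
        // The /sitemap.xml path is just an example.
        $output .= 'Sitemap: ' . home_url( '/sitemap.xml' ) . "\n";
        return $output;
    }, 10, 2 );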

    After copying that down, I will go disable the Google Sitemap XML option to “Add sitemap URL to the virtual robots.txt file.”

    Then I will recreate the robots.txt, removing “Allow: /wp-admin/admin-ajax.php”.

    Now, to prevent indexing, all I have to do is find a common URL pattern and add a disallow. In my case, all of my advertisement landing page URLs look like this:
    mysite.com/fb-ad/001
    mysite.com/fb-ad/002
    mysite.com/fb-ad/003

    So I will add this disallow:
    Disallow: /fb-ad/
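
    So, putting it all together, the physical robots.txt I will drop into the web root should read something like this (keep or drop the Sitemap line depending on whether you still want the generated sitemap advertised; same placeholder as above):

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /fb-ad/
    Sitemap: (a url to my site)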

    Yaaaaaaaaahooooooooo!

    A WORD OF CAUTION: It is possible that the plugin does not obey the robots.txt; I haven’t tested this yet. If it doesn’t, then Google Sitemap XML will still be adding those pages to the sitemap it generates.

    And that will cause a conflict (Search Console errors). I believe this is highly unlikely (again, untested as yet), because it seems basic enough that the plugin would consult the robots.txt before building its sitemap.

    However, if the plugin does keep adding unwanted pages to the sitemap, then you will have to remember to *manually* exclude the pages you don’t want indexed via the built-in exclusion list I described above.

    I will try to remember to update this thread, after I have implemented this on the site I am currently working on. If you test this out before I get to it, please share your results.

    Good luck, y’all!

    “No need to post almost empty replies just to follow a topic. There’s a link in the sidebar ‘Subscribe’ for that exact purpose.”

    Ah, thanks for the tip!

    So there is a solution built in, though it’s a bit of a pain. You can submit a list of post and page IDs, separated by commas.

    To find a page or post ID, go to your list of pages or posts in admin. Hover over the edit link and you will see the ID in the URL. It will say something like “https://your-website.com/wp-admin/post.php?post=3390&action=edit”.

    That being said, I don’t think simply removing pages from the sitemap will prevent them from showing up on Google, because bots will follow any links they find. A better solution would be to put a noindex in the <head> of the pages you don’t want to show up (<meta name="robots" content="noindex">).
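
    If you don’t want to hand-edit theme templates to add that tag, a small snippet in a child theme’s functions.php can print it conditionally. A rough sketch only: 3390 is just the example ID from the edit URL above, and the second entry is a made-up slug, so swap in your own page IDs or slugs:

    add_action( 'wp_head', function () {
        // Print a noindex tag only on the pages listed here (IDs or slugs).
        if ( is_page( array( 3390, 'some-landing-page' ) ) ) {
            echo '<meta name="robots" content="noindex">' . "\n";
        }
    } );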

    Or you could use robots.txt to disallow the pages you don’t want crawled (wildcard patterns work there, though strictly speaking robots.txt blocks crawling rather than indexing).

    My problem is that I can’t find my robots.txt! It is definitely being served; I verified via Google Search Console and I can browse to it (my-site.com/robots.txt). However, I cannot find the actual file in my public_html folder, either via my FTP client or directly through the cPanel File Manager!

    I suspect it has been moved and .htaccess is being used to redirect, but I can’t figure it out for the life of me!

    Tagging myself in because I’m looking for the same solution. Alternatively, I’d like to know how to exclude specific static pages.

    Why? Because I create landing pages for ads that run on Facebook, Google, etc. I don’t want these pages indexed.

    My initial thought was to use robots.txt to disallow them, but I see two problems with that. First, since this plugin is now controlling robots.txt, will my directives be overwritten? Second, even if I exclude them via robots.txt, this plugin will subsequently tell bots to index those pages by including them in the sitemap.

    Sorry for the slow response but… awesome, glad it worked out!

    I’m off to write a review for you! 🙂
