topseosmo.com

@Barnes591

Barnes591

Last seen: Tue 11 May, 2021

Recent posts

 query : Putting Google+ and Facebook reviews on my website with Schema.org I have some reviews on my Facebook page and Google Plus page. Can I put those reviews on another third-party review page using

@Barnes591

Posted in: #DuplicateContent #SchemaOrg #Seo #UserReviews

I have some reviews on my Facebook page and Google Plus page. Can I put those reviews on another third-party review page using schema?

Will they be counted as duplicate content by Google? Is this good for SEO?

10.06% popularity

 query : How to avoid SSL warnings about the certificate not being for the current site when redirecting alternate domains? I have a forwarding issue with multiple domains: plural and singular forms, e.g. example.com,

@Barnes591

Posted in: #301Redirect #Http #Https #Redirects #SecurityCertificate

I have a forwarding issue with multiple domains: plural and singular forms of the same name, e.g.:


example.com,
examples.com,
example.biz,
examples.biz


I forwarded all of them to examples.com and www.examples.com, but ran into browser warnings that someone could intercept communications, because the SSL certificate was not issued for the website I visited. I only have an SSL certificate for examples.com and www.examples.com.
What is the best solution for this?
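For context, a minimal sketch of what one redirecting vhost looks like on Apache (paths and filenames are hypothetical). The browser validates the certificate before the redirect is ever sent, which is why a certificate covering only examples.com triggers the warning on the other domains:

# Hypothetical Apache vhost for one of the alternate domains
<VirtualHost *:443>
    ServerName example.com
    SSLEngine on
    # This certificate must cover example.com itself, not just examples.com
    SSLCertificateFile /etc/ssl/certs/example.com.crt
    SSLCertificateKeyFile /etc/ssl/private/example.com.key
    Redirect permanent / https://examples.com/
</VirtualHost>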

10.01% popularity

 query : Impact of using .resx files to store multilingual content on SEO Could anyone tell me whether Google crawlers are capable of reading through the contents of a multilingual website which

@Barnes591

Posted in: #AspNet #Seo

Could anyone tell me whether Google's crawlers are capable of reading the contents of a multilingual website that uses .resx files to dynamically switch languages, and whether using .resx files has any impact on SEO?
The way the project is organized is:

- It has a default culture hard-coded into the main page, and based on that ASP.NET serves the content from the default-language .resx file.

- If the project's default culture differs from the user's preferred culture, the culture is switched to the user's preference and the resources are drawn from the appropriate .resx file.

- App_LocalResources contains a .resx per language for each page separately, as in mainpage.aspx.de.resx / mainpage.aspx.fr.resx, anotherpage.aspx.de.resx / anotherpage.aspx.it.resx / anotherpage.aspx.resx (the last one for the default language), and so on.
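For context, each of those files is plain XML key/value pairs; a trimmed sketch (real .resx files carry extra schema boilerplate, and the key name here is hypothetical):

<!-- mainpage.aspx.de.resx, trimmed to the data entries -->
<root>
  <data name="WelcomeHeading" xml:space="preserve">
    <value>Willkommen auf unserer Seite</value>
  </data>
</root>

Note that a crawler never fetches these files; it only sees the HTML that ASP.NET renders from them for whichever culture was selected.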

10.01% popularity

 query : Re: SEO - switching domains on an established site to a brand new domain I made a new site for a client that is a huge improvement in terms of asset optimisation, page loads, on-page SEO etc.

@Barnes591

How to SEO-relaunch:


Run a Link Detox audit, where you will find harmful backlinks pointing to the old domain.
Clean up those backlinks (disavow them).
Keep both versions live.
Implement 1:1 redirects, or topically relevant redirects - covering all URLs if possible.
Wait some days and check "site:olddomain.tld" from time to time: are all redirects working correctly?
The pages of your old domain should slowly disappear from the Google index.


I would not suggest keeping the old domain as a lead collector. This can be irritating for customers, and you will waste your link juice/SEO power/whatever you want to call it.

And regarding backlink strategy:
Do not do any cheap, spammy link building. Create some healthy, topic-related content on other sites. If you want to keep your customer - quality counts!

10% popularity

 query : Re: Is Google Analytics tracking all of my traffic on all accessible variations/properties? I am wondering: does Google Analytics track all of my properties? https://www.example.com (SSL + www) https://example.com

@Barnes591

If the same tracking code is embedded on all these URLs - yes.
It should set cookies for all subdomains,
regardless of SSL or non-SSL.

But I would suggest not setting a specific cookie domain unless it is really necessary. And it looks like "not necessary" in your case :-)

<script type="text/javascript">
// Command queue stub so ga() can be called before analytics.js has loaded
window.ga=window.ga||function(){(ga.q=ga.q||[]).push(arguments)};ga.l=+new Date;
ga('create', 'UA-1234567891-0', 'auto'); // 'auto' lets analytics.js pick the cookie domain
// Plugins (outboundLinkTracker is part of Google's autotrack library, which must also be loaded)
ga('require', 'linkid', 'linkid.js');
ga('require', 'outboundLinkTracker');
ga('send', 'pageview');
</script>
<!-- The stub assumes the library itself is loaded somewhere on the page: -->
<script async src="https://www.google-analytics.com/analytics.js"></script>

10% popularity

 query : HTTP website returns 403.4 even when using Plesk Permanent SEO-safe 301 redirect from HTTP to HTTPS My domain has a valid and activated "Let's Encrypt!" certificate, and going to https://example.com

@Barnes591

Posted in: #301Redirect #Https #Iis #Plesk

My domain has a valid and activated "Let's Encrypt" certificate, and going to https://example.com works as it should and shows as secure in the browser.

Through Plesk Onyx 17.5.3 I have activated the option Permanent SEO-safe 301 redirect from HTTP to HTTPS in Hosting Settings.

But when going to http://example.com, it shows a 403.4 Forbidden error saying that I should use the HTTPS version of the website, instead of redirecting to it.
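For context, 403.4 is IIS's "SSL required" substatus, so the Require SSL flag is answering the request before any redirect runs. For comparison, a working SEO-safe redirect on IIS usually ends up as a URL Rewrite rule like this hedged web.config sketch (assuming the IIS URL Rewrite module is installed):

<!-- Hypothetical web.config fragment -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="HTTP to HTTPS" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTPS}" pattern="^OFF$" />
        </conditions>
        <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>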

10.01% popularity

 query : Re: Is using Facebook advertising's "buyers" target a good way to lure likely shoppers? I need to advertise a website to sell products related to Saint Seiya, so I decided to begin with Facebook

@Barnes591

There are two options:


Improve your targeting, reaching people who have buying intent.
Improve your sales site so that it converts better.


You could also read up on Facebook's Custom Audiences feature for ads; there is a lot of fun stuff you can do with that ;-)

10% popularity

 query : Re: HTTP-header canonical and link tag canonical on same page? I've come across websites using both an HTTP-header-based canonical as well as a tag-based canonical. Sometimes they are not referencing

@Barnes591

If they do not point to the same URL, there will be trouble for the good old G-bot: it does not know what to do with those pages. The outcome will be fewer, or wrongly, indexed pages, which will lead to a loss of rankings.

Normally such problems show up in the Search Console messages when something like this is happening.

Same with hreflang - a different/wrong setup means the Google bot gets confused and does "stupid stuff".
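For reference, the two mechanisms in question look like this (a minimal sketch; the URL is a placeholder), and both should point to the same canonical URL:

HTTP response header:
Link: <https://example.com/page>; rel="canonical"

HTML head tag:
<link rel="canonical" href="https://example.com/page">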

10% popularity

 query : Re: How do you generate SEO pages that match a large number of generic search phrases and locations? I see that some sites in the Google search results have a matching page for almost any combination

@Barnes591

These pages are "pre-generated". "Dynamic computing" of pages would not be possible, as Google has to know and "rate" a page before giving it a (hopefully good) place within the SERPs.

10% popularity

 query : Re: Is it OK to back up a WordPress installation while it is serving content? Running a LEMP stack on Ubuntu 16.04. Found plenty of great documentation for backing up WordPress: Export your database:

@Barnes591

It should be completely safe to back up WordPress during normal operations. In fact, some of the best backups available for WordPress run live on the instance and do minute-to-minute backups of all changes (VaultPress and similar). Invoked backups will usually just copy the WordPress database tables and also copy the wp-content folder to the indicated destination. The only thing you may notice is some performance degradation on slow/shared servers, but you shouldn't see any issues on a VPS or a properly configured dedicated server.

tl;dr Yes, it’s safe.
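If you want to script an invoked backup yourself, a minimal sketch for a LEMP box (database name, user and paths are hypothetical):

# Dump the database while the site keeps serving; --single-transaction
# takes a consistent snapshot of InnoDB tables without locking them
mysqldump --single-transaction -u wpuser -p wpdb > /backups/wpdb-$(date +%F).sql

# Archive uploads, themes and plugins; copying live static files is safe
tar -czf /backups/wp-content-$(date +%F).tar.gz -C /var/www/example.com wp-content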

10% popularity

 query : Stop Googlebot accessing externally cached dynamic CSS/JS files Whilst analysing the server log files using Screaming Frog Log File Analysis I have noticed that Googlebot is accessing dynamic CSS

@Barnes591

Posted in: #Seo #SeoAudit

Whilst analysing the server log files using the Screaming Frog Log File Analyser I have noticed that Googlebot is accessing dynamic CSS and JS files that return an HTTP status of 404.

The site is on WordPress and uses the Automizer plugin.

The cache is purged every time the files change, so I think the non-existent CSS and JS files are old versions that Googlebot has already crawled.

Is there any way to prevent Googlebot from recrawling these files once they 404?
Would it be best to automatically create a 301 redirect when the CSS/JS files change?

10% popularity

 query : How to limit impressions for an order in DFP for Small Business? I want an ad to have a fixed limit of impressions, say 50,000. I can't find an option to limit that in DoubleClick for Small

@Barnes591

Posted in: #GoogleDfp #GoogleDfpSmallBusiness

I want an ad to have a fixed limit of impressions, say 50,000. I can't find an option to limit that in DoubleClick for Publishers Small Business.

Labels

I've found an article with instructions to add limits using labels, but labels don't exist in my menu. They may only be available in the premium version.

10% popularity

 query : Re: Can redirecting away from WordPress.com with a "Poor Man's Redirect" preserve SEO? My situation is the following: I have a WordPress free blog that I want to migrate to a static HTML site.

@Barnes591

No, my solution (the Poor Man's Redirect) would not preserve SEO. By destroying the content on the WordPress.com site, you remove its relevance to search engines. The content on the external site would become the indexed content and might get a small boost from link juice, but that's it.

10% popularity

 query : Soft 404 errors cause ranking drop or penalty? I own a free press release website with over 15k articles. I am in the process of cleaning up the site for spam and have removed over 100 pages

@Barnes591

Posted in: #GoogleSearch #Penalty #Soft404

I own a free press release website with over 15k articles. I am in the process of cleaning up the site for spam and have removed over 100 pages, but this causes soft 404 errors in the Google Webmasters console.

I get about 100 uniques every day from Google as organic traffic; recently the site was hit by an algorithmic penalty.

Do lots of soft 404 errors cause a Google penalty or a rankings drop?

10.01% popularity

 query : Re: What's the difference between WordPress hosting and other hosting plans? I aim to use WordPress to build an e-commerce website, however it's not clear if the hosting plans offer genuine difference

@Barnes591

While DocRoot is correct in that sometimes the only difference is marketing, in general there are some concrete improvements that most hosts make to the WordPress and server environments used for these plans.

On the low end (DreamPress, GoDaddy, 1&1, et al.) you will likely be placed on a VPS that has Varnish configured to run somewhat optimally with WordPress. The "managed" aspect usually means that the server is monitored for performance and the host makes sure the core is updated (which is less important than it used to be, since automatic core updates became the default). If a hosted site slows down or begins consuming too many resources, the host will usually notify you and/or take steps to reduce the server load.

The next tier up from these basic plans usually includes some form of security assistance and preventative features, and some will also claim to have a better database server (exactly how is hard to quantify). For example, I believe GoDaddy struck a deal with Sucuri not too long ago to integrate Sucuri's firewall and server checks into the GoDaddy WordPress hosting plans. DreamHost used to use StopTheHacker, but I haven't seen mention of it recently, so I don't know what went on there. Other extra features would include some form of basic CDN or valet support for the site.

Beyond these basic plans are the Dedicated WordPress Hosts, such as WP Engine, Page.ly or WordPress’s own VIP service. These services are roughly similar to the basic ones but tend to have more features (staging environments, better security, backups, etc.), way better support and a much more thorough approach to optimizing and securing WordPress and will also cost a ton more. WP Engine is famous for curating plugins and themes that have known performance and/or security issues and won’t even let you install those on their site. Whenever you attempt to install or update something, they prompt you to create a snapshot backup for rollback purposes and have live chat and phone support if something goes sideways. That’s the kind of service you get for the upper end tiers.

Are these things any better than a “standard” *nix hosting plan? Depends. If you have shell access to your server and enough permissions to install your own stuff or change how PHP is deployed AND you know what you’re doing with hardening WordPress, then the answer is “probably not” as you have the experience and expertise needed. If you don’t know how to manage the hosting environment to that level or don’t know/understand the security ecosystem around WordPress, then you probably should opt for whichever managed setup gives you the best bang for your buck.

For your particular case, you very well may want to explore the higher-end tiers of managed hosting as speed and security are more critical to e-commerce than other types of sites but it all depends on how much time and energy you want to dedicate to the environment instead of the site itself. If you can do both, great! Save yourself some money and get a good vanilla VPS or dedicated box. If you are better with WordPress than server stuff, go with managed hosting or a dedicated host.

10% popularity

 query : How to get theatre cafe to appear in Google searches for "cafe"? This is a question about a business with two categories: Performing arts theatre (primary category) + cafe (secondary). The

@Barnes591

Posted in: #GoogleSearch #Seo

This is a question about a business with two categories: Performing arts theatre (primary category) + cafe (secondary).
The arts theatre performs well in theatre-related searches.
It has a cafe which is featured on the website, but "nearby" searches on Google for "cafe in town-name" don't even list it, and text search lists it below smaller but more single-minded cafes.
I thought about creating a new Google My Business listing for the cafe at the same location, but then reviews would get split between the two.
So how can I tell Google to return it in searches for cafes?

10.02% popularity

 query : Re: How to avoid hreflang return errors with URLs that contain parameters I'm getting hreflang "no return tags" errors in Search Console and I don't understand why. We have a page that contains buttons for

@Barnes591

The problem you have is caused by the combination of canonical and hreflang.

Your setup

Given your example: for the URL example.com/au/publications?count=50&page=4 you specify the following:

<link rel="canonical" href="https://example.com/au/publications">
<link rel="alternate" hreflang="en-US" href="https://example.com/us/publications">


What's going on?

This is how Google reads this information:


This URL is a 1:1 duplicate of example.com/au/publications and I shall not index it,
but this URL also has an alternate language version for "en-US" available at example.com/us/publications.

What does this mean to Google?


the canonical is not a directive, so Google may or may not respect it
the hreflang is a very important piece of information; Google has to validate it by checking the alternate URL for return tags


What you see in your Search Console is that Google is doing exactly this: It ignores your canonical, visits the defined alternate URL and checks whether it links back to the original URL (with parameters).

The specified alternate URL does not link back to the parameterized URL and therefore causes an error.

How to get rid of this mess?

First: try not to distract Google.


If a URL is not a canonical URL - meaning it has parameters and links to a clean URL via link rel="canonical" → do not serve hreflang information.
If a URL is a canonical URL - meaning it is the one you want to be indexed and rank in search results and it has alternate language versions → do serve hreflang information for each language.


Applied to your example


for the URL example.com/au/publications?count=50&page=4 you only specify the canonical link: <link rel="canonical" href="https://example.com/au/publications">
for the URL example.com/au/publications you specify


the self referential canonical: <link rel="canonical" href="https://example.com/au/publications" />
each language version: <link rel="alternate" hreflang="en-US" href="https://example.com/us/publications"> and <link rel="alternate" hreflang="en-AU" href="https://example.com/au/publications">



Now Google reads:

example.com/au/publications?count=50&page=4 is a 1:1 duplicate of example.com/au/publications → I shall not index it.
example.com/au/publications targets "en-AU" language users and has an alternate version for "en-US" users available at example.com/us/publications.

Last but not least: make sure to serve hreflang information only when there are translations available. If there are no translations of a specific URL, it does not need hreflang annotation.



Sources

Visit the Google Guide on Use hreflang for language and regional URLs.

For your specific problem, have a look at the video from 08:50 onwards.

10% popularity

 query : Re: HTTP to HTTPS redirect: How not to create an infinite loop I have a WordPress install on a subdomain: https://blog.example.com To enforce SSL I have the following redirects in my .htaccess: <IfModule

@Barnes591

Since you are using WordPress, you could always change the wp_options table's first two entries (siteurl and home) to your https domain.

Another way to achieve this is to define WP_HOME and WP_SITEURL in the wp-config.php file. Here is the link to the relevant codex.
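A minimal sketch of the wp-config.php approach (replace the URL with your own subdomain):

// These constants override the siteurl/home rows in wp_options
define( 'WP_HOME', 'https://blog.example.com' );
define( 'WP_SITEURL', 'https://blog.example.com' );

The equivalent direct change in the database would be:

UPDATE wp_options SET option_value = 'https://blog.example.com' WHERE option_name IN ('siteurl', 'home');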

10% popularity

 query : How to get category-wise search engine view I have recently published 2 websites: one is the main domain with WordPress, another one is a subdomain with PHP. I have uploaded both site sitemap

@Barnes591

Posted in: #Seo

I have recently published 2 websites: one is the main domain with WordPress, the other a subdomain with PHP. I have uploaded both sites' sitemaps to Google, and Google has indexed all the links. But when I search for my domain name, it does not show up in a nice format. I have some schema data code added to the sites, but I am not sure how to do this. Please help.

For example, if I search for "Flipkart" then the full website shows up like this:


Right now I don't need the search box, just the listed (sitelinks) view. In WordPress I have Yoast.

For the PHP subdomain site I have meta markup like itemprop="url" & itemprop='name' under <head itemscope itemtype="http://schema.org/WebSite">
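For comparison, the same WebSite markup in JSON-LD form, which Yoast and similar plugins emit (a hedged sketch with placeholder name/URL; Google still decides on its own whether to show sitelinks or a search box):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebSite",
  "name": "Example Site",
  "url": "https://www.example.com/"
}
</script>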

10% popularity

 query : Re: Is it possible to submit set of dynamic pages to Google? Suppose that I have a page example.com/some-page?ending=er that finds all words that end with given ending. Is it possible to tell

@Barnes591

It's certainly possible to submit pages like that to Google but, as is pointed out in the comments, it's not at all guaranteed that they will show in the index or benefit you in any way and may even hurt you. What works in your favor is the existence of other "list pages" for searches like foods that start with x as well as sites that are used to help solve crossword puzzles.

So long as your endings are limited and somewhat navigable (e.g. a menu of the most common 2-3 letter endings for English words), I don't see why Google would completely ignore it. What Google probably won't do is enter the er into your site's search form and send users to it.
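If you do submit such pages, an XML sitemap is the usual vehicle; a minimal sketch with hypothetical URLs based on the example in the question:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/some-page?ending=er</loc></url>
  <url><loc>https://example.com/some-page?ending=ing</loc></url>
</urlset>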

10% popularity

 query : Re: What causes traffic from India to a set of URLs with a common path and a strange search term on a new Google Analytics setup? I added Google Analytics to my site a few weeks ago, and since

@Barnes591

You need to delete those directories and/or install an anti-malware plug-in on your site. My swim club's website kept crashing and we couldn't figure it out. Turns out there were a bunch of folders that had somehow been uploaded to our site. They appeared to contain music and video files. Since installing the anti-malware plug-in and deleting the offending directories, we haven't had any problems with our website. India is still the #1 source of traffic to our site, but all those visitors are getting 404 errors now. I've tried restricting visitors from India and a few other countries, but they're still getting through. Not sure how long it will take for them to get the point that their music and videos are no longer there.

10% popularity

 query : Re: Should I ignore W3 CSS Validation Service? I just checked my site with W3 CSS Validation Service. It had more than 200 errors. Then I checked many sites including Stack Overflow. Here is

@Barnes591

I would say it's good to have valid scripts. Having said that, when you use third-party products it may be hard to make everything valid.

If you have a custom-designed solution then you can easily fix things and make it fully valid.

So, I would say don't worry too much, but if it is within your capacity to make it valid then go ahead and do so.

Modified:

As stated, it's good to have valid HTML and CSS as it makes the site easier to manage. But if you are using a third-party system and it is too hard to make it valid, then do not worry, as search engines read imperfect markup fine.

10% popularity

 query : Re: Google Search Console is reporting 404s of combined and duplicated page URLs I am trying to increase the traffic to my website but I am getting weird Crawler Errors from Google. Google is telling

@Barnes591

Well, you can provide Google with a sitemap, which would help them index your website better, and try using canonical links.

Also review your URL Parameters settings in Google Webmaster Tools, and you can remove those URLs via the Remove URLs tool there.

10% popularity

 query : Getting custom WordPress pages with URL parameters indexed by Google I have a WordPress site set up that connects to a third party db to display data (widgets). I have one template set up

@Barnes591

Posted in: #GoogleCustomSearch #GoogleSearch #GoogleSearchConsole #Wordpress

I have a WordPress site set up that connects to a third-party db to display data (widgets).

I have one template set up that will display a list of widgets: mywebsite.com/widgets
Clicking through on a particular widget will bring you to a page like this: mywebsite.com/widgets/widgets/widget?cat=promo&sku=10034
So the widget information is queried based on the URL arguments supplied.

This works well, but I'm having trouble getting my Google Custom Search (or Google at all) to pick up on these pages. Ideally, I'd like it to index all the widgets listed on the .../widgets list.

In the Google Search Console, I can navigate to Crawl > URL Parameters and I see my widget parameters listed, with my total number of widgets. I clicked Edit and made sure to select that the parameter specifies content, and that Googlebot should crawl every URL.

So, based on that it seems Googlebot is aware of these pages, but I cannot get them to appear in search results. Is there anything I've missed?

10% popularity

 query : Fair Use of the thumbnail version of thousands of images from a third party? I would like to include on my website the thumbnail version of thousands of images created by a third party, which

@Barnes591

Posted in: #Copyright #Images #Legal #Thumbnail

I would like to include on my website the thumbnail versions of thousands of images created by a third party, which I have been trying to contact for permission, without success. I would link from each thumbnail to the original version, hosted on the web servers of this third party, and I would always cite the original author.

The images are generic landscapes related to the topic of my website (Africa). I created a thumbnail version, resizing the original dimensions (1024x682) to a square 150x150 pixel cropped thumbnail. These thumbnails, containing part of the original image, would be hosted on my web server.

The third party created millions of images, and I've selected only a part of the original dataset: only those images related to the topic of my website. The purpose is to offer information about Africa to the users of my website, linking to these original images.

I wondered if this could be considered 'fair use' of the content, in order to avoid legal issues with this third party. I've been reading fairuse.stanford.edu/overview/fair-use/four-factors/, but I would like to hear other points of view.

10% popularity

 query : How can you get a thumbnail when you share a PDF on Facebook One of my clients writes cookbooks. Some time ago, he wrote an article about the origins of Boston Cream Pie, which he has on

@Barnes591

Posted in: #OpenGraphProtocol #Pdf

One of my clients writes cookbooks. Some time ago, he wrote an article about the origins of Boston Cream Pie, which he has on his site as a PDF on a page of links to various articles he has written.

I would like to share that PDF on Facebook, but the PDF doesn't give Facebook any Open Graph tags, so the link is rather ugly.

Is there an Open Graph implementation for PDF URLs that would give his readers a good social media link, besides creating a new page with text and images that are more Open Graph friendly?

10.01% popularity

 query : Re: Replace website with newly redesigned and developed site without affecting SEO We are currently finishing off a development site dev.example.com for a business who currently have a website that

@Barnes591

Your hosting company gave you a good idea with the redirection (301).

Example:

Website with old content:


example.com/page-1.html
example.com/page-2.html


Website with new pages:


example.com/new-page-1.html
example.com/new-page-2.html


Assuming your page-1.html and new-page-1.html have the same content in a new structure; the same may be true for the other URLs.

Now, in this situation, you need to make a list of the old URLs and redirect (301 Permanent Redirect) each old URL to its respective new one. So, in the above scenario, page-1.html should be redirected to new-page-1.html.

There will be no ranking impact if you do this. You may notice minor fluctuation but it should recover very quickly.
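On Apache, a minimal sketch of such one-to-one redirects in .htaccess (filenames follow the example above):

# One permanent redirect per old URL
Redirect 301 /page-1.html /new-page-1.html
Redirect 301 /page-2.html /new-page-2.html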

10% popularity

 query : Are there other ways to manipulate the appearance of SERP results I have noticed that some forums are getting site links in the SERPs on desktop. When checking this URL in the structured data

@Barnes591

Posted in: #Seo #Serps

I have noticed that some forums are getting site links in the SERPs on desktop.



When checking this URL in the Structured Data Testing Tool, there is no schema markup.

Has Google just plucked this information from the page, or are there other ways to display information in the SERPs?

10.01% popularity
