
@Eichhorn148

Eichhorn148

Last seen: Tue 11 May, 2021


Recent posts

 query : Re: Which URL is SEO friendly when it comes to non ASCII characters? /λαδοκολλα-περι-ανεμων /ladokolla-peri-anemon In the first URL I'm using the same business name, while in the

@Eichhorn148

Never mind :) Do it whichever way your users like better. Google is able to understand both Latin and Unicode URLs. In countries like Russia, three URL types have coexisted for many years - Latin, Cyrillic (Unicode), and transliterated - and Google understands and ranks all of them.

Do some measurement and research on your users: where they come from, how much direct traffic there is, whether they mostly live in Greece or abroad. That will tell you how they type your page URLs to navigate, whether they have only a US/UK keyboard layout or a Greek layout too, and so on. Once you understand that, you can adjust your URLs to whatever is most comfortable for your users.


 query : Re: Rewrite to SEO friendly URLs for language negotiation with type-map files on plain html files using .htaccess When visiting the website www.example.com/ the visitor should be redirected to the

@Eichhorn148

I would use rewrite rules to issue 301 permanent redirects:

RewriteEngine on

# First handle the index.XX.html files
# where the language code is moved to the front directory
# and the index and html are removed
RewriteCond %{REQUEST_URI} !^/?([a-z]{2})/
RewriteRule ^/?(.*/)?index\.([a-z]{2})\.html$ /$2/$1 [L,R=301]

# Then handle other file.XX.html files
# where the language code is moved to the front directory
# the file name is preserved, but the html is removed
RewriteCond %{REQUEST_URI} !^/?([a-z]{2})/
RewriteRule ^/?(.*)\.([a-z]{2})\.html$ /$2/$1 [L,R=301]

# Handle all other file.XX.xyz
# where the language code is moved to the front directory
# and the file name and extension are preserved
RewriteCond %{REQUEST_URI} !^/?([a-z]{2})/
RewriteRule ^/?(.*)\.([a-z]{2})\.([a-z]+)$ /$2/$1.$3 [L,R=301]


This detects a two-character language code using the regular expression ([a-z]{2}). You could instead use something like (en|es|de) with a specific list of supported languages. Doing so could reduce mistakes from two-letter codes used for something other than language (like maybe a gzipped JavaScript file: .js.gz). It would also allow you to expand the rules to include country info if needed (like en-us).
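For example, the first rule rewritten with an explicit language list might look like this (a sketch, assuming en, es, and de are your supported languages):

# Only match known language codes, optionally with a country suffix
RewriteCond %{REQUEST_URI} !^/?(en|es|de)(-[a-z]{2})?/
RewriteRule ^/?(.*/)?index\.((en|es|de)(-[a-z]{2})?)\.html$ /$2/$1 [L,R=301]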

All the rules have a RewriteCond that makes sure they don't apply inside the language subdirectories. That should help prevent infinite loops or other problems.

The rules start with /? which consumes any leading slash so the rules can be used either in .htaccess or in an Apache .conf file.

The first rule makes the directory path preceding index optional using a question mark in the section (.*/)?. That allows the rule to apply to the root index.XX.html files.

I didn't put in a rule for the .var files because it sounds like you already have something in place for them. You would need to continue to handle those files as you do now.

I tested these rules on my local Apache server by putting them in .htaccess and hitting the site with curl. They seem to do the right thing:

$ curl -s --head 'http://example.com/index.en.html'
HTTP/1.1 301 Moved Permanently
Location: http://example.com/en/
$ curl -s --head 'http://example.com/index.es.html'
HTTP/1.1 301 Moved Permanently
Location: http://example.com/es/
$ curl -s --head 'http://example.com/about.en.html'
HTTP/1.1 301 Moved Permanently
Location: http://example.com/en/about
$ curl -s --head 'http://example.com/about.es.html'
HTTP/1.1 301 Moved Permanently
Location: http://example.com/es/about
$ curl -s --head 'http://example.com/images/logo.en.png'
HTTP/1.1 301 Moved Permanently
Location: http://example.com/en/images/logo.png
$ curl -s --head 'http://example.com/images/logo.es.png'
HTTP/1.1 301 Moved Permanently
Location: http://example.com/es/images/logo.png


 query : Re: Schema and Rich Snippets for embedded YouTube videos in an onclick iframe player? My site has YouTube thumbnails that when clicked, load a 90% full screen overlay on the page with the youtube

@Eichhorn148

I see nothing that would speak against structured data for your videos. Your page with videos looks like a kind of curated catalog, so you have every reason to mark it up with structured data.

Indexing could be a bit more complex, because Google already knows all of these videos - they are probably hosted on YouTube, correct?

If you really want to get them indexed on your page, I would invest some more effort and create some unique content for each video - like custom descriptions.

Yes, I know - the number of files you mention is no child's play, but you should give Google a reason to consider this page unique and useful. Some unique descriptions, some external backlinks - and you are done.

It may also be a good idea to implement comments (reviews) on the video page.
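A minimal sketch of VideoObject markup for one embedded video (all names and URLs here are placeholders, not from your site):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Example video title",
  "description": "Your custom, unique description of this video.",
  "thumbnailUrl": "https://www.example.com/thumbs/video-1.jpg",
  "uploadDate": "2018-01-15",
  "embedUrl": "https://www.youtube.com/embed/VIDEO_ID"
}
</script>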


 query : Re: Will Google penalize me for loading images that it can't see? (blocked by robots.txt) I have a script setup to load images for the visitor on another server. The robots.txt file on that server

@Eichhorn148

You ask two questions.

Firstly, you ask whether Google will penalize you.
The answer is: no way. If you don't want to show something to Google, Google will not play police, investigating whether you are hiding something contrary to law or ToS, and penalize you for it.

Then you ask whether Google would derank pages whose images are blocked by robots.txt.
The answer is: derank - no; rank MUCH lower - for sure. If a page with hidden images is allowed to be indexed, then the hidden content is a ranking malus. Google's ranking guidelines note that hiding content obstructs a clear and full understanding of the page's content.


 query : Re: Should I use VPS IP addresses or VPS domain names in the DNS Settings? I have recently applied a 'Let's Encrypt' certificate to my VPS. Whilst doing this, I came to learn that I could access

@Eichhorn148

Using A or CNAME records will have no effect on whether or not your site appears when you type in the IP address. Your site appears when you type in the IP address because every server MUST have an IP address assigned. DNS is then configured to point to that IP address. Even if you have no DNS entries at all, browsing to that IP address will get to your server.

I recommend serving a 404 Not Found error whenever somebody visits your site by IP address or with an unexpected domain name. Anybody could point the domain name youstink.tld at your server; there is nothing technical you can do to prevent that. When that happens, you don't want it showing your site, or even redirecting to it.

To make this happen under Apache, you need to configure two virtual hosts: one for your site, and a default virtual host. The one that comes first in the configuration (or alphabetically first, if there is a separate configuration file for each virtual host) is the one Apache treats as the default. The default one can just return a 404 error. From How do I configure the default virtual host to return a 404 header in Apache?

<VirtualHost *:80>
ServerName default
Redirect 404 /
</VirtualHost>
<VirtualHost _default_:80>
Redirect 404 /
</VirtualHost>


Then your other virtual host should have your site configuration in it:

<VirtualHost *:80>
ServerName example.com
ServerAlias example.net
Redirect / https://example.com/
</VirtualHost>
<VirtualHost *:443>
ServerName example.com
ServerAlias example.net
....
</VirtualHost>


 query : Re: Will social sharing help my website rank in search engines? I am owner of travel related website and recently found that two twitter users have shared a link from my website on their profile.

@Eichhorn148

Google has said many times that they do not use social shares as a ranking signal:


2014: Google's Matt Cutts: We Don't Use Twitter Or Facebook Social Signals To Rank Pages
2015: Google: Again, Social Signals Do Not Influence Your Ranking
2016: Social Sharing and Social Ads Have No Impact on RankBrain


SEO Moz points out that many webmasters notice that their most highly shared articles on social media also get the most search engine traffic. SEO Moz chalks this up to correlation with article quality and engagement. Better, more engaging articles get more shares, more links, and a high click through rate from the search results.

At best, social shares may attract users that end up linking to your content or start searching for your brand name. Social shares (and advertising) may indirectly help SEO through these mechanisms.


 query : Re: Does Google use AJAX powered hash fragment links in sitelinks? On my website, I use anchor tags to navigate as it's a single page. With that in mind, my links for the main nav look like:

@Eichhorn148

I very much doubt that this will be the case. URL parts after the hash aren't transferred to the server.

If I guess right and your site is a one-pager, then I have even more doubts that Google would establish sitelinks :(

But if your site is not a one-pager, you could easily change this behavior and get a much higher chance of achieving your goal (see the sketch after this list):


address your menu entries with a parameter instead of a hash, like ?m=about,
establish internal links to them from other pages, not just from the start page.
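A minimal sketch of such a parameter-based menu, with hypothetical page names:

<nav>
<a href="?m=home">Home</a>
<a href="?m=about">About</a>
<a href="?m=contact">Contact</a>
</nav>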


 query : Re: Do you need to add 301 redirect rules into robots.txt for search engines when you specify them in .htaccess? Do I need to add 301 redirects to the robots.txt file? I will be adding the redirects

@Eichhorn148

To answer your question: no, do not put any redirection rules in the robots.txt file, simply because it is the wrong place for them. I would highly recommend reading the standard robots.txt specification here and there.


Robots.txt is for bots, crawlers, and spiders that come automatically to read and save the content of your pages.
Redirection rules are for the server, to handle any and all requests, whether they come from a human or from software. (See the sketch after this list.)
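To illustrate the difference, a minimal sketch (the paths are just examples):

# robots.txt - crawl directives only; there is no redirect syntax here
User-agent: *
Disallow: /private/

# .htaccess - this is where 301 redirects belong
Redirect 301 /old-page.html /new-page.html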


There is an approach called cloaking, which was used by some not-so-white-hat SEOs years ago. It is a special setup in the server configuration (like your .htaccess) that recognizes search engine bots by their user agents and redirects them to a special location, while human visitors are redirected to other special locations.

The goal of this approach was: search engines get pages that are SEO-optimized to such a degree that they are no longer comfortable for humans to use, while human visitors get served not-so-optimized pages that can be used right away.

By now Google is very good at recognizing and penalizing such behavior - cloaking violates Google's ToS.


 query : SEO impact of using an Android app download splash page before the home page I have a splash page that user can decide to enter the website or download the android .apk file. I wonder that

@Eichhorn148

Posted in: #GoogleSearch #Seo

I have a splash page where the user can decide to enter the website or download the Android .apk file. I wonder whether using the splash page has a bad SEO impact, since it's a static page with few keywords and little related rich content.

Does Google prefer to see updated content on the front page every time it crawls my website?


 query : Google analytics: previous url in click-through path I'm trying to find out the previous url to the given url and get into a mess on this. I'm currently in analytics view Behavior→all pages.

@Eichhorn148

Posted in: #GoogleAnalytics

I'm trying to find out the previous URL to a given URL, and I'm getting into a mess with this.

I'm currently in the Analytics view Behavior → All Pages. The first column is Page. I have tried adding a secondary dimension, and here is what I get:


Destination Page - there I get the same URL as in Page,
Next Page Path - the same thing, the same URL as in Page,
Second Page - always looks like the next higher directory. If Page in the first column is /1/2/3/, Second Page is /1/2/; if Page is /1/2/, Second Page is /1/.


My thought: either something is wrong in the Analytics setup, or I just don't understand how the things I'm looking for work.

Could somebody point me in the right direction? I need to know the next URL that was visited after the given URL (given = the primary Page URL in the view).


 query : Re: During a website redesign, how will removing unused pages/URLs affect SEO ranking? I have been designing a new company website on Wordpress. Its design and content is all entirely different from

@Eichhorn148

Your previous SEO man's statement is correct. But there is another question: what would happen if you followed your approach instead of the approach recommended by your previous SEO man?

The answer depends. I'll try to describe the thinking behind it:

Let's imagine your old pages, like website.com/subject-A-1, have external backlinks.

Case 1: you create an exact counterpart on the new website for the old page, like new-website.com/subject-A-1. Then 301-redirecting the old URL to the new one will route external backlinks to the new URL with the same content, and preserve the chain backlink → content.

Case 2: you don't create an exact counterpart for the old page. Nevertheless, you establish a 301 redirect from the old URL to the new one, as you describe in your approach. Doing so routes external backlinks to the new URL, which is theoretically correct, but the chain backlink → content will be broken.

In some cases it isn't bad to break this chain - for example, where you equip the new page with more, longer, and better content than was on the old page.

The decision you make is based on two things:


whether and how many external backlinks point to the old pages,
what kind of content you will place on the new pages.


If there are no or just a few backlinks, and/or the content on the new pages will be MUCH better than on the old pages - don't hesitate to use the approach you propose.

If there are many backlinks, and/or the content on the new pages will be equal to or poorer than on the old pages - follow the approach of your previous SEO man and create a new counterpart for each old URL. Either way, the redirects themselves are simple (see the sketch below).
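A minimal .htaccess sketch of both cases on the old site, using the example URL from above (the target paths are placeholders):

# Case 1: exact counterpart exists on the new site
Redirect 301 /subject-A-1 https://new-website.com/subject-A-1

# Case 2: no counterpart - send the old URL to the closest relevant new page
Redirect 301 /subject-A-2 https://new-website.com/subjects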


 query : Re: Will AMP version replaced with mobile-version in search result? We implemented AMP version of https://www.tarafdari.com two weeks ago. During this period we had 3% growth in session but 15% of

@Eichhorn148

If you have both of them, then yes. If you have a mobile version and AMP, your AMP version will be used only for some special AMP placements, like the carousel, Top Stories, and the like. In this case you should keep in mind issues like the placement of advertising (splitting), avoiding duplicate content, etc.

If AMP is your only mobile version, then Google will use it as your general mobile version. In this case you drive safely, without any issues caused by two site versions serving the same purpose.

My personal choice would be to maintain the AMP version as the only mobile version: this approach seems to me to minimize management effort. If your site doesn't sell anything (as far as I can tell, maybe besides subscriptions), you'll do fine, without feature losses, with AMP as the only mobile version.


 query : Re: How to force Google and other bots to pick actual images and not thumbnails? For example, if there is an online shopping websites with thousands of small thumbnails of products and when you

@Eichhorn148

Link the thumbnails to the images for users that don't have JavaScript enabled. Like this:

<a href="full-image.jpg"><img src="thumb-nail.jpg"></a>


As long as users can see the full image by clicking, you can disable the click event on the link using JavaScript without incurring a penalty from Google.

Google treats links to images the same as actually having the full image in the page itself. Linking will get the full image to rank in image search and associate it with the correct page on your site.

You could also add an HTTP header to the thumbnail images that prevents Google from indexing them:

X-Robots-Tag: noindex


If you run the Apache server, you could use a directive in .htaccess like:

<FilesMatch ".*thumb.*.(gif|jpg|jpeg|png)$">
Header set X-Robots-Tag noindex
</FilesMatch>


 query : Stats in Google Search Console dropped off after changing domain and moving to HTTPS, but analytics traffic is steady I have two sites where the data in GWT has done this Site 1 past 90 days

@Eichhorn148

Posted in: #GoogleSearchConsole #Traffic

I have two sites where the data in GWT has done this

Site 1 past 90 days


Site 2 past 28 days


Notice the huge drop-offs in data - but when I look at any other analytics tools such as GA or the stats in Wordpress, everything is fine.

The first site has a massive dip in impressions and clicks; position and CTR have recovered a little but are still off.

The second site has just disappeared completely.


Is Google just having a bad day/week/month?
Should I expect this to just recover on its own?
Should I contact Google?


 query : Re: Get search engines to index hash-based URLs I have a client who has website with multiple pages. Now, each page has tabbed content which is accessible through #. For example: example.com/page1#content1

@Eichhorn148

As mentioned in the comments, the simplest way to address tabs with URLs is to address them with parameters instead of hashes, like www.example.com/page-1?tab=2.
As you correctly said, hashes will not be passed on to the server (and thus to Google), but parameters will.

A kind of solution is described at css-tricks.com/better-linkable-tabs/


 query : Re: SEO - switching domains on an established site to a brand new domain I made a new site for a client that is a huge improvement in terms of asset optimisation, page loads, on-page SEO etc.

@Eichhorn148

Switching to the new domain could bring trouble to the new domain only if the previous domain is under a penalty.

If not:


switch,
don't forget a clean redirect policy,
make some effort to re-point the external backlinks that deliver the most referral traffic from the old domain to the new one.


That's pretty much all - within some weeks, after Google recognizes all the redirects and backlinks, rankings will come back and rise higher than they were with the old domain.

Massive errors in, or a lack of, redirects from the old to the new domain could make Google's process of moving ratings take much longer - don't fail here; check your redirects twice before going live. (A minimal redirect sketch follows below.)

Re-pointing backlinks from the old domain to the new one can substantially speed up the re-assignment of ratings from the old to the new domain. If Google is not forced to go through a redirect (backlink → old domain → redirect to new domain) but can instead go the direct way (backlink → new domain), it saves a hop and returns the favor.
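A minimal .htaccess sketch for the old domain (old-domain.com and new-domain.com are placeholders):

RewriteEngine on
# 301-redirect every request to the new domain, preserving the path
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://new-domain.com/$1 [L,R=301]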


 query : Re: Html5 tag video VS Youtube video Is it better to embed a Youtube video in an Iframe or to load the video on an HTML5 tag ? Which's better from an SEO point of view ? Is there any difference

@Eichhorn148

From the "pure" SEO point of view there is no difference between implementation variants you mention. Both of them are equally good or bad recognized by Googlebot as video files.

Much more important is containing of meaningful structured data to describe your video. Because Googlebot can't recognize without structured data about what your videos are.

Also important are the video title, description, categorization in your youtube account, where the implemented video is hosted.


 query : Re: Does sitemap file need to have extension .xml Im using a cms that generates a sitemap automatically but without ".xml". IS this ok? Can we put this is our robots.txt: Sitemap: https://www.mydoamin.com/sitemap

@Eichhorn148

Not mandatory. Google's guidelines allow XML, TXT, and HTML formats. Look into the guidelines for further info.

Note: if the sitemap has any format other than XML, it must be encoded as UTF-8 and contain nothing more than URLs, one per line. A sitemap in XML format may contain some additional information. (A plain-text example follows below.)
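A minimal sketch of a plain-text sitemap (UTF-8, one URL per line, placeholder URLs):

https://www.example.com/
https://www.example.com/about
https://www.example.com/products/widget-1

And the corresponding line in robots.txt:

Sitemap: https://www.example.com/sitemap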


 query : Re: What is the solution for "Submitted URL not selected as canonical" in Google Search Console beta? Google's new webmaster tools has a section 'Index coverage'>'Excluded'>'Submitted URL not selected

@Eichhorn148

You explicitly requested indexing of some URLs (through a sitemap or the webmaster tools) that are duplicates without a canonical. The question is rather: why do you want duplicated URLs to be explicitly indexed? That is not in line with good SEO practice.

Set your duplicated URLs to noindex or, at the least, set a canonical so Google knows what to rank instead of the duplicated content - either action will solve the issue. A minimal sketch of both options follows below.
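Both options go in the head of the duplicate page (the URL is a placeholder):

<!-- option 1: keep the duplicate out of the index -->
<meta name="robots" content="noindex">

<!-- option 2: point Google at the preferred version -->
<link rel="canonical" href="https://www.example.com/preferred-page/">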


 query : Re: How should structured data look for WordPress categories and tags? Working on rebuilding a WordPress blog with structured data I researched the site and ran across "Using Schema.org for blogging:

@Eichhorn148

Structured data doesn't help rankings. At best it can get Google to enhance the display of your site in the search results. Google maintains a gallery of all the ways they use schema.org markup in search results.

Google does allow you to mark up the breadcrumbs of a page for display instead of the URL in the results. You could choose either categories or tags for that. Categories would usually make the most sense because they have a hierarchy that fits well into breadcrumbs. There is no benefit to marking up both categories and tags. I wasn't able to find data on how much breadcrumb markup might change the click-through rate (CTR) from the search results. It certainly should have some effect, but I would expect it to be modest, since most users click on the big blue titles and not on the green URLs or breadcrumbs. A minimal markup sketch follows below.
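A minimal sketch of BreadcrumbList markup, using hypothetical category names and URLs:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Recipes",
      "item": "https://www.example.com/category/recipes/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Desserts",
      "item": "https://www.example.com/category/recipes/desserts/"
    }
  ]
}
</script>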

Google says that they have an article carousel, but I've never seen that in my search results, so I doubt that your Article markup will have much effect. It would probably be most effective for newsy articles for which Google had determined that the queries deserve freshness (QDF). I think Google is most likely to show the article carousel for news related searches.


 query : Why is Google reporting CSV files as "soft 404"? I have a few hundred soft 404 errors reported in Google Search Console. Almost all of them are for CSV files containing data. For example

@Eichhorn148

Posted in: #Csv #Google #GoogleSearchConsole #Soft404

I have a few hundred soft 404 errors reported in Google Search Console. Almost all of them are for CSV files containing data. For example here is the HTTP response for one of them:

HTTP/1.1 200
Content-Disposition: attachment; filename="fewer-bank-failures.csv"
Content-Length: 116
Content-Type: text/csv; name="fewer-bank-failures.csv";charset=UTF-8
Date: Thu, 01 Feb 2018 11:32:56 GMT
Server: Apache
Connection: keep-alive

"",Bank Failures
2000,2
2001,4
2002,11
2003,3
2004,4
2005,0
2006,0
2007,3
2008,25
2009,140
2010,157
2011,92
2012,51


Why is Google reporting that this is a soft 404? I've usually seen soft 404 because:


You have a "200 OK" status but say "not found" in the page
You redirect to the home page
The page is blank


I can't figure out why Google would think that this CSV file would indicate a not found error.

I do understand other reasons that Google might not want to index this content:


It is a download attachment rather than a page
CSV wouldn't be the best landing page experience
The content is duplicate -- we have an HTML page with the same data, including a graph.


I would expect Google to choose not to index the page for one of those reasons, but I am completely surprised that they call it a "soft 404".

What can I do to tell Google that the page is real? Would using a Link: <https://example.com/fewer-bank-failures.html>; rel="canonical" HTTP header help?


 query : Re: Google Search Console - Submitted URL not selected as canonical We have 3 product pages in which we have added some internal links (PDF links), the crawl report Google search console is showing

@Eichhorn148

The main requirement for a clean, automatic implementation is that your PDF files have the same file names as their corresponding HTML pages, i.e.:


HTML page: /example-guide-1.html
corresponding PDF file: /example-guide-1.pdf


Once your file names follow this structure, add something like the following to your .htaccess to attach a canonical link to the corresponding HTML file to every PDF file:

RewriteRule ([^/]+)\.pdf$ - [E=FILENAME:$1]
<FilesMatch "\.pdf$">
Header add Link "<https://example.com/%{FILENAME}e.html>; rel=\"canonical\""
</FilesMatch>
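You can check the result with something like this (assuming the example file name from above):

$ curl -s --head 'https://example.com/example-guide-1.pdf' | grep -i '^Link'
Link: <https://example.com/example-guide-1.html>; rel="canonical"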


 query : Re: Discreetly remove page from Google index without modifying page HTML Let's just say a "friend" wants to do a guest blog on your site with a link to their site, but you aren't very happy about

@Eichhorn148

Ethical aspects aside: yes, it is possible. Leave the link dofollow in the page source code, but send nofollow with an X-Robots-Tag HTTP header (note that this header applies to all links on the page, not just the one in question), like:

<FilesMatch "page.html$">
Header set X-Robots-Tag "nofollow"
</FilesMatch>


Googlebot reads the X-Robots-Tag earlier than the source code and, despite the conflicting signals about this link, it would (hopefully, and according to its behavioral logic) follow the signal it got first.


 query : Re: Can I use my own domain at wordpress.com retaining my DNS outside WP? I have my own domain (e.g. contoso.com). I also have a (free tier) wordpress.com account, using contoso.wordpress.com.

@Eichhorn148

WordPress.com has a support page about mapping a subdomain to your site using their paid yearly plan: en.support.wordpress.com/domains/map-subdomain/
Their instructions say that you need to do two things:


Add CNAME records in your DNS:


sub.example.com CNAME to example.wordpress.com

In your WordPress.com site add the domain to your site (example.wordpress.com) using the "Already own a domain" option and make it the "primary" domain.


Since you are using a bare domain name (no subdomain), you can't use a CNAME for the apex record. To use an external DNS system, you will need to find one that supports "ALIAS" records. An ALIAS serves an A record, but it periodically looks up the IP address to serve from the alias that you set. I use Amazon Route 53 when I need a DNS host with alias records.

You would want to set the following records:


example.com ALIAS to example.wordpress.com
www.example.com ALIAS to example.wordpress.com (WordPress would redirect this for you according to their documentation)


You could also try just setting the A record, however if WordPress changes their IP address, your site could stop working without warning. The following are the current IP addresses for lb.wordpress.com which is what your site currently points to:


example.com A 192.0.78.12
example.com A 192.0.78.13
www.example.com A 192.0.78.12
www.example.com A 192.0.78.13


To avoid the redirect-loop problem, you need to clear your browser cache after making the DNS record changes. Your browser will have cached the redirect from your domain name to your WordPress subdomain, so once your site is hosted on the domain name, you need to clear the cache to be able to access it. Only people who visited your domain name while it was redirecting will have that problem.

I also noticed that WordPress.com has instructions for getting mail to work: en.support.wordpress.com/domains/custom-dns/ It says that they support DKIM TXT records. You may be able to use WordPress.com DNS servers and still have the DKIM support you need.


 query : Re: How do you generate SEO pages that match a large number of generic search phrases and locations? I see that some sites in the Google search results have a matching page for almost any combination

@Eichhorn148

This is a kind of SEO from the Middle Ages. Google punishes such sites with pretty low rankings; the Panda updates especially target them. The more keywords a single URL ranks for, the better Google thinks of it and the higher it ranks. Got the idea? Not 1 keyword = 1 URL, but 1 URL = as many keywords as possible.

The technical part of producing SEO spam like you describe is pretty simple: something like a cron job pre-renders pages with placeholders, which are filled in from a keyword list or database.


 query : Re: Real value of Google Ads app conversion rates We are facing a doubt when reading the technical details of the products and steps needed for publicizing our app and then measuring the app conversion

@Eichhorn148

As far as my workmate, who works in CPA, explained it to me - it depends.
There are two kinds of AdWords campaigns for apps: installation and reactivation/usage. If you run an installation campaign, Google will record the installation; if you run a reactivation/usage campaign, Google will record the app start.


 query : Re: Showing the home page first in Google search results for searches for the domain name I have example.com built in WordPress, SEO done using YOAST, if you search "website.com" for it on Bing

@Eichhorn148

"Profile" isn't really a bad name for the home page, but it isn't complete. The home page could be named any of:


"Brand Name Profile"
"Profile of Brand Name"
"example.com Profile"
"Profile of example.com"


It isn't just the title of the home page that is important though. The page should have the brand name listed in text on the page. Most sites have a footer with the brand name. Right now you don't use the brand name or the domain name anywhere on the page. I'd suggest adding text to the footer:


Copyright 2018 Brand Name


The brand name should also be used as the alt text on the logo. Your logo currently has alt="".

<img src="logo" alt="Brand Name">


Once the brand name is actually used on the page, then Google will realize that the page is relevant for brand related queries.

Your suggestion of using "home" rather than "profile" won't make much difference. Text of the brand name and domain name are plenty of signal to Google.


 query : Can screenshots of sports sites be used on a blog? I'm writing a blog on hockey news, analyzing things such as trade rumors, what players would fit on what teams, and what players are performing

@Eichhorn148

Posted in: #Copyright

I'm writing a blog on hockey news, analyzing things such as trade rumors, what players would fit on what teams, and what players are performing better than others. To do this, I want to use screenshots of statistics from Yahoo Sports. I would crop the image to get the picture of the player with their name and statistics of this season (not the whole page). The screenshots are related to what is discussed in my posts and it's a non-commercial blog. Does this fall under fair use? The country I am in has signed the Berne Convention. I would also provide links back to the webpage I took the screenshot from.


 query : Re: Merge content of two pages in one page I have a website with millions of pages. Page 1: http://example.com/commercial-product-name1 Page 2: http://example.com/commercial-product-name2 page 3: http://example.com/Scientific-nam

@Eichhorn148

With millions of pages I see no chance for sustainable SEO.


First, I would noindex all pages that don't get traffic (Analytics to the rescue); a minimal sketch follows after this list.
Second, I would run broad keyword research to discover which products have search volume, then optimize those product pages for keywords with search volume and allow them to be indexed.
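The noindex step, sketched as a robots meta tag in the head of each no-traffic page:

<meta name="robots" content="noindex,follow">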

