topseosmo.com

@Angela700

Angela700

Last seen: Tue 11 May, 2021

Signature:

Recent posts

 query : Estimating Google Cloud Platform costs dilemma I'm stuck between 2 things I'm not experienced at,,, 1- traffic capacity: with Google Compute Engine 96 Virtual CPU and 360 GB memory costs 2330$/month

@Angela700

Posted in: #Google #GoogleAppEngine #GoogleCloud #GoogleCompute #Server

I'm stuck between two things I'm not experienced with:

1. Traffic capacity:
a Google Compute Engine instance with 96 virtual CPUs and 360 GB of memory costs about $2,330/month, but I don't know roughly how much traffic it can handle.

2. Resources required:
on the other hand, there is the auto-scaling Google App Engine, for which Google offers a calculator where I can enter estimated usage, so I can predict the traffic (including worst-case scenarios), but I don't know the resources required.

Any experienced webmaster out there, please advise. Thanks.

My website is a dynamic PHP site where users search saved posts and view them. There are also APIs for the Android and iOS apps. Finally, all media files are uploaded to YouTube and Google+ to avoid media hosting costs.

10% popularity Vote Up Vote Down


Report

 query : Re: Is AMP suitable for a non-periodical website? I am working on a project to convert some documentation into a mobile-friendly form. The current documentation is outdated and is using MediaWiki,

@Angela700

My answer to your question of whether AMP is a suitable format for non-periodical websites is: yes. AMP is targeted more at the page load speed of a website. Think of it as a mobile version of your main site with faster loading (responsive on all mobile devices). I don't think it's only applicable to news sites. I'm not sure if you've heard that Google is now using AMP in its email; most likely, the big G is going to push web admins to make their websites AMP-validated. AMP is now integrated into Google Search Console.

I think the best thing to do is to move your content to a better CMS (like WordPress) to easily create the AMP version of your website. It will also be easier to manage content than with the outdated MediaWiki.
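For reference, the canonical and AMP versions of a page are linked with a pair of tags in each page's head; this is only a sketch, and the URLs are hypothetical:

```html
<!-- On the canonical (non-AMP) page, point to the AMP version: -->
<link rel="amphtml" href="https://example.com/docs/page.amp.html">

<!-- On the AMP page, point back to the canonical page: -->
<link rel="canonical" href="https://example.com/docs/page.html">
```

With this pairing in place, Google can discover the AMP version and associate it with the original document.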

10% popularity Vote Up Vote Down


Report

 query : Re: Can misused Schema.org hurt SEO? I've inherited a site where I noticed that there is an excessive use of Schema.org in all navigation elements pages describing products are marked as itemtype="http://schema.org/CreativeW

@Angela700

2018 Update: Just to be clear, spammy or incorrect structured data CAN hurt your SEO.

Why? It's a manual action that Google will alert you to in your Search Console. If not addressed and fixed, your rankings WILL decline. We had a client who used structured data to mark up three reviews plastered in the footer on every page. The result: a quick manual action which, after some time, caused a noticeable hit in rankings.

Also think of Schema as a way to categorize your site. Say you're a restaurant but you mark yourself up as an Ice Cream Shop. Well suddenly, you're limiting yourself to a small niche audience when you could be reaching out to more people.

When using structured data always try to be as precise as possible without cutting out any of your potential audience.

10% popularity Vote Up Vote Down


Report

 query : Re: Should anyone be getting this error message from T-Mobile? A friend of mine was trying to sign into their T-Mobile account to pay their bill when they saw this strange message. Welcome, first

@Angela700

The port number of websites is standard. Port 80 for HTTP websites and port 443 for HTTPS websites. Everyone knows this so giving that information away is fine.

As for the server software: Apache is one of the most widely used web servers, and as long as the website team at T-Mobile is on top of things and keeps its servers fully patched, it shouldn't be a concern.

There is some argument that hiding what server software you are running provides a security benefit. But this is called security through obscurity, and most security people don't consider it a form of protection at all. You'd be much better off keeping Apache fully patched and up to date at all times than relying on something that offers no real protection against a hacker who knows what they are doing. You might fool a script kiddie, but no one else. Basically, it is more important to keep things patched than to rely on removing the information about Apache. Of course, hiding it as well as keeping things up to date won't hurt.

10% popularity Vote Up Vote Down


Report

 query : Re: Does Google index images declared using 'srcset'? As of Chrome 40, the srcset attribute is supported, but will Google index the images within it?

@Angela700

Update from February 2018: it seems Google is still initially picking up the src attribute. I added the higher-resolution images to the image XML sitemap, and it looks like Google is picking those up.
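For context, a responsive image of the kind discussed looks like this (the filenames are made up); the src attribute is the fallback that Google appears to index first, while srcset lists the higher-resolution candidates:

```html
<img src="photo-800.jpg"
     srcset="photo-800.jpg 800w, photo-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     alt="Example photo">
```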

10% popularity Vote Up Vote Down


Report

 query : Re: SEO value in links from same hosting account I have a number of websites on my hosting account - all of these belong to me. One of these websites is a website development consultancy /

@Angela700

I suggest you put "nofollow" on the links to all of the websites you have designed. This will be useful especially if those websites are not topically relevant to your website (which I assume is the case, unless they are website-design-related blogs). I don't think this will be considered a PBN attempt as long as you have properly indicated "this website is designed by", "website design by", or whatever.
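A credit link with the suggested attribute might look like this (the URL and studio name are hypothetical):

```html
<a href="https://example-client-site.com" rel="nofollow">Website design by Example Studio</a>
```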

10% popularity Vote Up Vote Down


Report

 query : Re: How to handle old posts & categories that are not in use I have a few old categories on my website, and I'd like to clean them up, but I'm a little uncertain how to handle two aspects

@Angela700

With the categories: you could clean them up and 301 redirect the old categories to the new or closely related ones.

Regarding the posts: you can 301 redirect the low-quality ones to higher-quality or more relevant pages. Another alternative would be to rewrite or update them in a new post to improve the content, and then 301 the old posts to the updated content.

For the snaps category, I would recommend merging them all under a single category page and adding noindex, nofollow to the individual posts/images, which could otherwise be classified as thin content. You can also create sub-categories like /snaps/2017/ or /snaps/cars/, etc.
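On Apache, per-path 301s like those described can be sketched in .htaccess (the paths are hypothetical and would need to match your real URLs):

```apache
# Send an old category to its closest replacement
Redirect 301 /old-category/ /new-category/
# Collapse a retired sub-category into the merged snaps page
Redirect 301 /snaps/old-sub/ /snaps/
```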

10% popularity Vote Up Vote Down


Report

 query : Re: Is placing an image over H1 text an SEO issue? We have a main image on every page of a website. This image includes some text. We want to use text in image as h1 but to make it live

@Angela700

The <img> element is classified by the HTML living standard as both flow content and phrasing content, and <h1> </h1> accepts phrasing content, so it is fine to use an image tag inside any heading tag.
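For example, a valid heading containing an image looks like this (the filename and text are made up); note that the alt text carries the heading's meaning:

```html
<h1>
  <img src="main-banner.jpg" alt="Acme Widgets – Handmade Widgets Since 1999">
</h1>
```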

In terms of SEO, it is an issue to present content that is visible to bots but hidden from the user. According to Google, this is a direct violation of their guidelines, and you can easily trigger a penalty if they detect algorithmically that you are trying to manipulate rankings that way.

However, you can still hide text from the user for legitimate purposes, such as accessibility.

According to WebAIM, there are a couple of techniques to hide content from sighted users but not from screen readers; this allows people with disabilities to "see" the content while regular users do not. This technique positions the text off-screen (note that position: absolute is required for the left offset to take effect):

.hidden {
  position: absolute;
  left: -10000px;
  top: auto;
  width: 1px;
  height: 1px;
  overflow: hidden;
}


There are some limitations, of course; for example, you cannot use this technique if the image inside the H1 will be used as a link.

For more information about the techniques you can implement visit this page.

10% popularity Vote Up Vote Down


Report

 query : Re: 404 or 302 Redirect - what to use for a url which may be used in the future but not available at the moment My site lists blogs like this example.com/?status=blog&id=number I only have

@Angela700

My only concern would be how bots are finding or manipulating URLs to reach the unavailable ones. To me, from an SEO perspective, this is a crawling issue that potentially has an impact on performance and perceived website quality.

Had this issue been solved, we would not have to take these considerations into account. If applicable, consider fixing the way bots are instructed to crawl your website and the way you are fetching the content from the database, to avoid generating automated content or blogs without content.

After checking your code to understand how a bot is able to request such URLs, try adding Allow rules like the following before any Disallow in your robots.txt file:

Allow: /?status=blog&id=1

Allow: /?status=blog&id=2

Allow: /?status=blog&id=3

...

Disallow: /*&id*


In the development stage of any website, I always recommend serving some "coming soon" content and setting up Google Search Console and even Analytics before publishing the new blog/site. This allows crawling and indexing of the new website, which is good for SEO purposes.

If for some reason you cannot resolve the crawling issue, I would suggest neither of the first two solutions. With the first choice (404), you will potentially generate an almost infinite number of 404 error pages, which, from the SEO perspective, is a sign of poor maintenance, bad UX, and low quality. A 302 might seem a good alternative, but it will consume a lot of resources, both yours and the bots' (we don't want them to get mad at us; remember Skynet). Since this is probably auto-generated content/URLs, I suggest implementing the noindex, nofollow meta tag approach and returning an HTTP 200 status response code. You can also add nofollow to the links pointing to those blogs. You might say, "wait a minute, 200 OK responses still consume resources"; true, but I believe 3xx responses are more expensive.
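The meta tag approach mentioned above, as a sketch: this one line goes in the head of the placeholder page, which still returns HTTP 200.

```html
<meta name="robots" content="noindex, nofollow">
```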

10% popularity Vote Up Vote Down


Report

 query : Google caching wrong content for top ranking URL causing rankings to drop 12 pages When I use 'cache:example.com/category' Google is showing the homepage 'example.com' as the URL for the cached

@Angela700

Posted in: #Cache #GoogleIndex #GoogleRanking

When I use 'cache:example.com/category' Google is showing the homepage 'example.com' as the URL for the cached content.

If I Google search 'info:example.com/category' it returns a result for the homepage as well.

It seems Google has mistakenly indexed the homepage as our /category page.
The page in question has taken a hit in rankings (from page 1 to page 12... page 12!)

An interesting part is that if I do 'cache:' or 'info:example.com/category/' with a trailing slash, Google seems to show the proper cache/info result, and shows the URL without the trailing slash.

More info regarding the affected category URL:


the URL with a trailing slash is redirected to the URL without the trailing slash
there are no redirects for the URL in question (without the trailing slash)
meta robots is follow,index
query params (?color=red) are disallowed and are self-canonicalized to the URL without query params anyway
robots.txt is not blocking any resources or the URL to the page
Google can fetch and render the page fine, and the Googlebot version is identical to the user version


Response Headers are as follows:

cache-control:no-store, no-cache, must-revalidate, post-check=0, pre-check=0
content-encoding:gzip
content-type:text/html; charset=UTF-8
date:Wed, 31 Jan 2018 14:40:37 GMT
expires:Thu, 19 Nov 1981 08:52:00 GMT
pragma:no-cache
server:nginx
status:200
vary:Accept-Encoding
x-content-type-options:nosniff
x-frame-options:SAMEORIGIN
x-sucuri-cache:BYPASS
x-sucuri-id:14018
x-xss-protection:1; mode=block

The platform we are using is Magento 1.9.x.

I have requested reindexing of the URL (without the trailing slash) several times in the past week or so, and while it seems Google has indeed crawled the page several times since then, it still shows the wrong content for 'cache:' and 'info:'.
Seemingly it still thinks the homepage is the proper canonical/content for the URL!

I'm not sharing actual domain URL for privacy reasons.

10% popularity Vote Up Vote Down


Report

 query : Re: Is it better for SEO to have multiple sub pages or all pages sumed up? I am doing a relauch of my site. example.de/service has 5 subpages. In my new layout I would love to put all subpages

@Angela700

The best approach depends on the content.

If each separate page (or sub page) contains enough paragraphs on a topic (almost like a storybook), then leave your setup as-is.

If, however, each page has very little content, or even just a few images by itself, then you're better off merging the pages together to form a storyline so that search engines can index your content more effectively.

The last thing you want guests to do is click the "next" button (or buttons similar to it) more often than viewing content. I bet that operation will strain some people's fingers over time. The point is, people want to see content they are looking for without making so many clicks.

And yes, if you decide to update the URLs, implement 301 redirects from the old URLs to the new URLs.

10% popularity Vote Up Vote Down


Report

 query : How do I do an htaccess rewrite on an internal URL without Google crawl giving me a soft 404 error? Here is my issue. I have a rewrite rule in my htaccess like this: RewriteRule ^(news|review|feature|editorial|podcast)/

@Angela700

Posted in: #CrawlErrors #Htaccess #UrlRewriting

Here is my issue. I have a rewrite rule in my htaccess like this:

RewriteRule ^(news|review|feature|editorial|podcast)/([0-9]+)/(.*)$ /article.php?id=$2 [L]


Basically what it does is take in a URL like this:
www.example.com/feature/13889/negative-world-2017-goty-awards

And bring the user to this page:
www.example.com/article.php?id=13889

However, within the last year or so Google seemed to decide it doesn't like this anymore, and will return soft 404 errors when it crawls any of those original URLs. ("The target URL doesn't exist, but your server is not returning a 404 (file not found) error.")

Whether because of this or something else, a lot of my content is no longer indexed on Google, which led to a massive drop in hits for my site.

While reading up on this, I read something that said 301 redirects would solve this. So I switched the line to this in htaccess:

RewriteRule ^(news|review|feature|editorial|podcast)/([0-9]+)/(.*)$ /article.php?id=$2 [L,R=301]


Should that solve the issue, or is there more that I have to do? I'm a little worried because when I do "Fetch as Google" it just says "redirected".

I guess my other question would be, would this explain why this stuff isn't being indexed anymore, or is there probably some other issue I should be looking into?

10.01% popularity Vote Up Vote Down


Report

 query : When I perform a who-is lookup on a .me domain name, why is it not able to find it despite the fact that it works? I know some people who are hosting a TeamTalk server on their Linux VPS.

@Angela700

Posted in: #Domains #WebHosting

I know some people who are hosting a TeamTalk server on their Linux VPS. They are using a .me domain. I wanted to see who their domain registrar was, as well as their nameservers, but I wasn't able to find anything for the .me domain.
I guess my question is, can you make your own top-level domain and be your own domain registrar? If so, how does this process work? Do you need some sort of DNS software installed on your server?

10.01% popularity Vote Up Vote Down


Report

 query : Re: Does Classic ASP pages called with Querystring parameters are cached by the HTTP-level Kernel cache? Our company have a website made in Classic ASP, which most of them are static pages. We have

@Angela700

The kernel cache in IIS requires several conditions to be met by the requested resource in order for the response to be cached. One of those conditions is that the URL must not contain a query string. You can check it in Microsoft's Knowledge Base here:
support.microsoft.com/en-us/help/817445/instances-in-which-http-sys-does-not-cache-content

If one or more of the following conditions are true, HTTP.sys does not cache the request response:

....


The request contains a query string.



It makes total sense, and that's exactly the use case that you need.

HTH

10% popularity Vote Up Vote Down


Report

 query : Re: canonical and place in country I have a website that is divided for the UK, thus I have it as for the UK in its entirety I have it as: example.org/index.php << parent site example.org/england/index.php

@Angela700

Yes, you need to implement hreflang to define the region and language. Learn more about it here.

I would also suggest that child pages look like:

example.org/scotland/index.php << child page
example.org/scotland/wales/index.php << child page

Nothing that important; it's just a preference.
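As a sketch of the hreflang tags mentioned above (note that hreflang distinguishes language and country codes, e.g. en-GB, not sub-regions like England or Scotland, so these values are illustrative only):

```html
<link rel="alternate" hreflang="en-gb" href="https://example.org/index.php">
<link rel="alternate" hreflang="x-default" href="https://example.org/index.php">
```

The x-default entry marks the page served to users who match no other language/region annotation.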

10% popularity Vote Up Vote Down


Report

 query : Should I not 301 redirect when changing a business name and domain? From what I understand if I setup a 301 redirect for: www.joe.com > www.bill.com Then when people navigate to www.joe.com

@Angela700

Posted in: #Redirects #Seo

From what I understand, if I set up a 301 redirect for:
joe.com > bill.com
then when people navigate to joe.com they will be sent to bill.com.
This also allows search engines to transfer the page rank.

My question is what about people who find my company via typing it into search engines, since currently if they typed in 'Joe', joe.com would be the first result.

Am I right in thinking if search engines remove my old site from the index and people then search for 'Joe' then they will have no way of finding my new site? In this case of business name change and domain name change would it be better to not 301 redirect and instead direct users to the new site by clicking a link on the old site?

10.01% popularity Vote Up Vote Down


Report

 query : Static page which uses Ajax to fetch "Latest posts" on homepage from Wordpress I am making a website which has a blog as well as a "normal" website. Would there be any negative consequences

@Angela700

Posted in: #Ajax #Seo #Wordpress

I am making a website which has a blog as well as a "normal" website.

Would there be any negative consequences of installing Wordpress on a separate blog.domain.com and using AJAX to fetch information about latest posts from the WordPress site and loading this onto the regular html page? (I've not looked up the exact code which I'd need for this, but I'm pretty sure it's possible).

So basically, what I'd like to do is have two domains: domain.com which will consist of regular HTML pages, and a blog.domain.com. On the homepage of domain.com I want there to be a "Latest from blog" section, which I'd like to use AJAX to get the data for, as this is the only bit of the site which will change on a frequent basis.

Are there any disadvantages of this setup (SEO, etc.)?

/* Not really sure if this is the right SE site, but since it's basically "how to avoid using Wordpress" I feel like it's not right for the WP SE, and it's not really a programming question, so it doesn't belong on SO either lol. */

10.01% popularity Vote Up Vote Down


Report

 query : Want to know reasonable hosting platform for website I am working as project lead in a firm. This is my first project and it's quite big and important for me. We're creating a website which

@Angela700

Posted in: #LookingForHosting #SharedHosting #WebHosting #Webserver

I am working as project lead in a firm. This is my first project and it's quite big and important for me.

We're creating a website which will register users and accept their one time registration fees.

The website will cover news and events and other detailed stuff.

We're expecting 500k registration.

Almost 2K users will be accessing the site at any given time.

We're using :-

CORE PHP

HTML

CSS

MySQL

JavaScript

I want a suitable web host for this project, as I am afraid of over- or under-provisioning.

Please suggest options, as I am not going with AWS.

Your help will be appreciated.

Thanks

10% popularity Vote Up Vote Down


Report

 query : Re: Server.Transfer method for mobile version I just want to be sure if i use server.transfer method for display mobile version of my website, is it good for SEO and google ? somehow i managed

@Angela700

As far as I understand, the Server.Transfer method is better than the old Response.Redirect method because it improves performance by reducing the number of server requests. To me that is good enough, and as long as the path parameter is clean, you will always retrieve the necessary resources to load the page.

Responsive design is Google's recommended display method, because the page URL and HTML stay the same. If your URL stays the same but the HTML changes, Google will classify the format as dynamic serving, whereas if both the URL and HTML change, Google will treat it as separate URLs.

Each format has a different approach when it comes to SEO. The way you identify the user agent is also important: for dynamic serving, Google recommends using the Vary HTTP header. In the case of separate URLs, you need to establish the relationship between the two versions; here Google recommends using the <link> tag, where the desktop version needs rel="alternate" and the mobile version needs rel="canonical" to link the mobile and desktop URLs.

Please take into account that m.example.com and example.com are essentially different URLs.
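Google's documented annotation pattern for separate URLs looks like this (example.com and m.example.com stand in for your real hosts):

```html
<!-- On the desktop page (https://example.com/page): -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page (https://m.example.com/page): -->
<link rel="canonical" href="https://example.com/page">
```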

Whatever your approach might be, the bottom line is that you need the following to always be true:


Make sure the content and links on the mobile site are at least 90% similar to the desktop version.
It is highly recommended that the content be the same in both versions (remember, Google prefers responsive websites, where URL and content stay the same).
Make sure to have structured markup for both the desktop and mobile versions. Use the robots.txt testing tool to verify that your mobile version is accessible.
Expandable content such as "Read more" will have more weight on mobile.
The page speed of your mobile site will determine the rankings of both your mobile site and your desktop site.
Google has started to use the mobile version of the web as its primary search engine index.


I highly recommend you read what Google has said about going mobile-first.

10% popularity Vote Up Vote Down


Report

 query : Re: Not use Google Analytics could penalize page rank in Google Search? I am using Yandex Metrika as tool for analytics. However I am concerned about the impacts in terms of SEO. For example. How

@Angela700

Google collects user behavioral data independently of whether websites have Google Analytics installed.

Types of user behavior data used by Google include click-through rate (CTR), navigational paths, time, duration, and frequency.

The way they do this is by tracking user behavior on each set of organic search results, the same way they do with paid ads; in simple words, they store data associated with the use of all the links appearing in SERPs. Search engines also weigh the probability of a link being clicked in relation to its position, and how this influences the user's further behavior on the websites they visit; this has arguably become an essential part of how the algorithm works.

Be conscious of this idea: Google, as a service provider, will track users and their relationship with your website, not websites. Of course, they will crawl your website to classify the information it provides in response to their users' queries, but they will essentially track and try to determine their users' level of satisfaction.

Picture this: a user queries a search engine and clicks one of the links Google provides; then either the same user comes back to the same SERP and clicks another link, refines the initial query, or never comes back. All these possible user actions are expected and tracked, to measure how well the result set provides the right information in response to the user's search intent. Google will make sure they are providing the right answer to their users, and will reassess if necessary to provide a more tailored result set.

Google Analytics is a tool that provides information to you; it is part of an ecosystem Google has created to serve webmasters and, of course, to provide some other benefits to Google. But they do not rely on everyone installing Google Analytics to have the information necessary to improve their services.

Check out this video, where Google clarifies that they do not use Google Analytics data for rankings.

10% popularity Vote Up Vote Down


Report

 query : Re: Calculate conversion percentage in Google Analytics based only on visitors to the page with the conversion action I have promo website for an app. There is an "Install now" button on the index

@Angela700

I found a way to see the conversion rate for just one page (considering only sessions for that page):

Behavior -> Site Content -> Landing Pages

Then choose the page you want to check (it is just a slash "/" in the case of the index page).

10% popularity Vote Up Vote Down


Report

 query : Re: Will adding a 'nofollow' attribute to links on my homepage result in crawling issues? On my homepage I have three links that link back to my homepage. That means that a loop of links is

@Angela700

Do not use the nofollow attribute as a workaround for the JavaScript error; even if you implement it, you will still have the same error.

The nofollow attribute is there to control indexing and serving by search engines. Most search engines interpret nofollow links in different ways, affecting the overall site structure and how they understand the information you provide.

For example, rel="nofollow" links were created to help webmasters tell Google NOT to pay attention to external references, documents, or web pages that they believe will not help the user or that they do not endorse. Likewise, if you are paying for ads with links to your site that are surrounded by content or other ads you don't want associated with your website, you use nofollow links. And if you read this post, you will notice that you can tell Google not to crawl specific sections of your website that you don't really need indexed, such as pages for authorized users only, where login is required.

The homepage is expected to be accessible and crawlable from everywhere on your website. We don't really know the exact consequences, but you will definitely have crawling issues, and errors will be reported in Search Console.

Google expects you to use nofollow in certain ways; it is how it understands the content you provide, not a means to overcome a JavaScript issue. Any use of this attribute other than those explained in the link above will be seen as an attempt to manipulate PageRank, even if that was not your intention.

10% popularity Vote Up Vote Down


Report

 query : Re: How to avoid multiple results in google search for same page if some results contain utm_source parameters? We have an old domain we are slowly turning off and are currently redirecting pages

@Angela700

This should never happen. The only way for a page to show up in search with tracking parameters is if campaign parameters have been used in internal links.

Campaign tagging is exclusively meant for external links that point back to your website. I could go into all of the reasons why campaign tracking can be incredibly valuable for marketers, but the bottom line is that utm_source parameters allow a website owner or marketer to change or reassign the source and medium data.

So let's say that you tagged an external link to look something like this: www.yourwebsite.com/100s-of-ways-to-trash-analytics-data/?utm_medium=social&utm_source=facebook.com&utm_campaign=sample+campaign
A user sees this link and clicks it from a social media management tool such as Hootsuite. Because the source was modified or "reassigned" to be Facebook (utm_source=facebook.com), Facebook will now appear as the source (vs. Hootsuite). Furthermore, instead of the visit showing up as a referral, it will appear as social, because that is how the medium was assigned (i.e. utm_medium=social).

So, for whatever reason, some misguided marketers add campaign parameters to their internal banners, and even to the navigation bar and/or links in the footer of the site.

So let's say that someone reads your profile page on stackexchange.com and clicks through to your site, then clicks a banner that inadvertently, intentionally, or ignorantly has campaign parameters attached to it. Upon clicking the banner and landing on the contact page, that visitor no longer shows up as coming from stackexchange.com; rather, the end result is that the referral is from your own site. It has been overwritten.

Here is what you need to do:

Option A: If you are a member of Moz, they have a decent crawl tool.

Option B: If you aren't a Moz Pro member, try out Screaming Frog:


Visit the Screaming Frog site and download the software.
Enter your website's URL and initiate a crawl.
Once your crawl is complete, click the "Internal" tab and filter by "utm_" to see URLs that have been tagged with campaign parameters. You technically should not see any URLs when executing this task.
If you do happen to see URLs after filtering the results, then you have one or more pages on your website linking to pages tagged with campaign parameters.
Find all of the pages which are linking to utm_-tagged pages. To do this, click on a URL in the top area, and choose the "Links" tab towards the bottom of the window. Notice the "From" column, as it should show all of your internal pages that linked to the tagged URL(s).
Take a look in Google Analytics. Make sure you don't have ANY URLs that aren't being tagged properly. If you have tagged appropriately, you will not find campaign parameters (i.e. utm_) showing up within ANY content reports.


Lastly, I always recommend following Google's advice when it comes to all things online. Here you can find information about URL mapping, which includes how to update internal links (see section 2, under the subsection "Update all URL details").

I'm not sure how far along this process you are, but relying on analytics may not be the most trustworthy way to go; without more information, there is just no way to advise you any further.

Best of luck!

10% popularity Vote Up Vote Down


Report

 query : Home page has a youtube video on it, unsure how to fix google's description A website i'm working on has embedded on it's homepage a youtube video. For some reason, google is ignoring my meta

@Angela700

Posted in: #GoogleSearch #WebCrawlers

A website I'm working on has a YouTube video embedded on its homepage.
For some reason, Google is ignoring my meta description and is instead showing the following message as the description: "If playback doesn't begin shortly, try restarting your device. More videos. An error occurred. Please try again later."

I noticed this when googling the company's name.
Any thoughts on how to fix this? Basically, make Google show my meta description instead of playback problems. The video isn't even the first thing on the home page.

10% popularity Vote Up Vote Down


Report

 query : Re: How to index pages with tabs for SEO I have a page with five tabs. The page will load with the first tab opened. The remaining tabs are only visible by clicking the respective tabs; these

@Angela700

Contents that are hidden in tabs are still read by the search engine bots. In the previous years, tabbed content might have some negative connotations. This is also applicable to toggles or accordion. You can check if the content is readable by the search engine bot using Browseo. You can also check it in the Google Webmaster Tools - the Fetch as Google tool.

Check this out: www.searchenginejournal.com/google-says-now-ok-put-content-behind-tabs/178020/

10% popularity Vote Up Vote Down


Report

 query : Re: Using 301 redirects to change domain and maintain SEO I’m setting up a Wordpress website with a new domain name to replace my client’s current website. I’d like to redirect the old domain

@Angela700

1) By applying 301 redirects you will basically tell search engines that you have moved that page or group of pages to another location permanently. No problems here, all good.

Google has stated several times that 301 redirects pass the ranking power from the links pointing to your old URLs on to the new ones. So to answer your first question, rest assured you won't lose any rankings that way.

However, you might lose rankings from the potentially 200+ other ranking factors that will be reassessed once the new website is live. You might lose rankings by changing the layout, website structure and design, or by adding new content. You might even lose rankings by changing servers: the website will serve content from a different location, and the impact on performance will also play an important part.

Ideally, before deciding to migrate, upgrade or improve any website, we know our customers' needs, or we have learned enough from past experience that any change we make will have a positive impact on the business or, most importantly, on our user base. Unfortunately, that is not always the case.

2) The .htaccess code looks good, but don't take my word for it; you need to test it, because each environment is different. Make sure that when you redirect any URL, especially the homepage, you don't chain more than one redirect.

So, for example, if you are redirecting example-old.com to example-new.com, make sure a request doesn't first redirect to www.example-old.com and only then to example-new.com.

Run tests so you can be sure that both the www and non-www versions of the old website always redirect to example-new.com.
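As a sketch of that single-hop idea (using the example-old.com / example-new.com placeholders from above; always test in your own environment before deploying), one rule that sends both the www and non-www old hostnames straight to the new domain in a single 301 might look like this:

```apache
<IfModule mod_rewrite.c>
RewriteEngine On
# Match the old domain with or without the www. prefix...
RewriteCond %{HTTP_HOST} ^(www\.)?example-old\.com$ [NC]
# ...and 301 every path straight to the new domain in one hop.
RewriteRule ^(.*)$ https://example-new.com/$1 [R=301,L]
</IfModule>
```

You can verify there is only one hop by running curl -I against a few old URLs and checking that the first Location header already points at example-new.com.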

10% popularity Vote Up Vote Down


Report

 query : Website redirects to httpss (not https) in Safari and IE but not Chrome I have made a redirect rule in my .htaccess file for https://pg-flowsolutions.com that specifies that if someone enters

@Angela700

Posted in: #301Redirect #Htaccess #Https #Redirects

I have made a redirect rule in my .htaccess file for pg-flowsolutions.com that specifies that if someone enters an invalid path (for instance pg-flowsolutions.com/blablabla), the visitor should be redirected to the website's index.

Now this works fine in Google Chrome, but in Safari and IE an extra 's' is appended in the address bar, so the browser attempts to open httpss://pg-flowsolutions.com, which obviously produces an error.

So what I am looking for help with is finding where the extra 's' in the address bar actually comes from when an invalid URL is opened.

Is it something in my .htaccess file that causes this, can the extra 's' be caused by some server setting in Apache, or is it perhaps something that can be corrected in WordPress?

I have not been able to identify what causes this behaviour; my .htaccess file is the suspected culprit, but I have not been able to find any errors there.

It currently forces all HTTP visitors to the HTTPS version of the site, and it is also supposed to redirect all attempts to open a non-existent page to the index page.

As mentioned, this works for me in Chrome but not other browsers.

My .htaccess file looks like this:

<IfModule mod_rewrite.c>
RewriteCond %{HTTPS} off
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
</IfModule>

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>

10.01% popularity Vote Up Vote Down


Report

 query : Re: Play ".ch" extension stream on HTML I am trying to play .ch stream from a html player. But can't find the solution. .ch streams can be played successfully in Android via "ExoPlayer". But I

@Angela700

.ch is nothing more than a ccTLD, which represents the country of Switzerland. From my knowledge and research, there is nothing inherently special about .ch, or about any domain extension for that matter, when it comes to streaming.

There is, however, a domain extension that is marketed for streaming music and/or video. That extension is none other than ".stream". Here is what GoDaddy has to say about (dot)stream.

Look, at the end of the day, domains are domains. Use .ch if you intend to target users in Switzerland, as that is the purpose of this ccTLD.

If you want to buy a .stream domain name, there is nothing special about it other than the extension, which implies your intent to "stream", if you will.

10% popularity Vote Up Vote Down


Report

 query : Re: HTTP to HTTPS redirect: How not to create an infinite loop I have a WordPress install on a subdomain: https://blog.example.com To enforce SSL I have the following redirects in my .htaccess: <IfModule

@Angela700

I was able to resolve it by changing the .htaccess like so:

<IfModule mod_rewrite.c>
RewriteEngine On

# BEGIN FORCE HTTPS
RewriteCond %{HTTPS} !=on
RewriteCond %{ENV:HTTPS} !=on
RewriteRule .* https://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
# END FORCE HTTPS

# BEGIN WordPress
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress

</IfModule>
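Checking both %{HTTPS} and %{ENV:HTTPS} is what breaks the loop here, because on some hosts SSL is flagged in an environment variable rather than on the connection itself. As a related sketch (my own assumption, not part of the original answer): on sites behind a proxy or load balancer that terminates SSL, where neither variable is ever set, you would key off the X-Forwarded-Proto header instead.

```apache
<IfModule mod_rewrite.c>
RewriteEngine On
# Behind an SSL-terminating proxy the backend connection is plain HTTP,
# so test the header the proxy sets rather than %{HTTPS}.
RewriteCond %{HTTP:X-Forwarded-Proto} !=https
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
</IfModule>
```

Only apply this variant if your proxy actually sets that header; otherwise this rule can itself create a redirect loop.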

10% popularity Vote Up Vote Down


Report

 query : How to stop Bot generate Traffic in Wordpress? My WordPress website recent facing a big problem. Spamming problem. Using a plugin i detect that every single second here coming huge number of

@Angela700

Posted in: #Googlebot #SpamBlocker #SpamBots #SpamPrevention

My WordPress website is recently facing a big problem: spam. Using a plugin, I detected that every single second a huge amount of fake traffic is coming in, sent by bots. The traffic is not referral traffic and is not showing in Analytics. Because of this, the CPU usage reaches 100% within seconds. The visitors come in like this...



There are also some plugins where the offending links have to be entered manually, but that is not really feasible, because a huge number of sites come in every second. How can I solve this?

10% popularity Vote Up Vote Down


Report
