topseosmo.com

@Hamaas447

Hamaas447

Last seen: Tue 11 May, 2021

Signature:

Recent posts

 query : Re: Install WordPress in a subdirectory of MODX CMS For a client that needs a re-design and a plugin that exists only for WordPress, I would like to install WordPress in a subdirectory of a website

@Hamaas447

The directory structure will be something like /home/user/www (or public_html), and that is likely where you will find MODX.

Create a subdomain new.example.com and install WordPress there. Your WordPress files should be in /home/user/www/new.example.com.

You will be able to work on WordPress at new.example.com/wp-admin.

When you are done, back up your MODX site, delete everything (except the backup and the WordPress directory, of course), and move all the files from /home/user/www/new.example.com to /home/user/www/.

I find that sometimes it is necessary to go into the dashboard and simply resave permalinks to make it all work properly.
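The cut-over described above can be sketched in shell. This version runs against a throwaway directory so nothing real is touched; on an actual server, SITE_ROOT would be something like /home/user/www, and every path here is an assumption to adapt:

```shell
# Throwaway layout standing in for /home/user/www
SITE_ROOT=$(mktemp -d)
mkdir -p "$SITE_ROOT/new.example.com"
echo "wordpress" > "$SITE_ROOT/new.example.com/index.php"
echo "modx" > "$SITE_ROOT/index.php"

# 1. Back up the MODX site before deleting anything
tar -czf /tmp/modx-backup.tar.gz -C "$SITE_ROOT" --exclude=new.example.com .

# 2. Remove the MODX files (everything except the WordPress directory)
rm "$SITE_ROOT/index.php"

# 3. Move WordPress up into the web root
#    (dotfiles such as .htaccess need a separate mv on a real site)
mv "$SITE_ROOT/new.example.com/"* "$SITE_ROOT/"
rmdir "$SITE_ROOT/new.example.com"
```

After the move, visit the dashboard and resave permalinks as noted above.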


 query : Re: SEO focused navigation I am a web designer and front-end developer. I almost always utilize a primary navigation bar that clearly demonstrates traditional website pages - "Home, About Us, Blog,

@Hamaas447

Putting "Ohio" in the navigation menu (as a menu label) is a spammy SEO tactic. It might have worked in the past, but Google's bots are smarter now. Apart from that, it also looks spammy to users; it's off-putting.

However, if the "Ohio" term is used in the URL or permalink of the page (domain.com/ohio-service-here) and in its meta title, but not as a menu label, it's good for SEO.


 query : How can I block who.is and archive.org from getting information of my website with htaccess? who.is is a service that gives people whois information of websites and archive.org automatically saves

@Hamaas447

Posted in: #Htaccess #SpamBlocker

who.is is a service that gives people WHOIS information about websites, and archive.org automatically archives people's websites. How can I block them from accessing my website using .htaccess?


 query : Does Google use AJAX powered hash fragment links in sitelinks? On my website, I use anchor tags to navigate as it's a single page. With that in mind, my links for the main nav look like:

@Hamaas447

Posted in: #Google #Links #Sitelinks

On my website, I use anchor tags to navigate as it's a single page. With that in mind, my links for the main nav look like:

www.example.com#about
https://www.example.com#contact
www.example.com#pricing

It's my hope that Google will see this navigation and build the sitelinks on my search listing from them but if they don't count as internal links then I don't think this will happen.

Could someone clarify, do they count as internal links or should I split the site into separate pages?


 query : Dynamic URLs - redirect or rewrite, and page SERP position I'm trying to get my head around this. Our site has thousands of links; it's an e-commerce site where product titles require changes

@Hamaas447

Posted in: #301Redirect #302Redirect #Redirects #Url #UrlRewriting

I'm trying to get my head around this. Our site has thousands of links; it's an e-commerce site where product titles require changes on a regular basis to get the best SEO. As usual, we build the URL from the product title, cleaning out all the odd characters to make an SEO-friendly URL.

However, all the pages are dynamically generated, which means that if someone on the data content team changes a product's title, that product's URL will change. That raises these questions:


If the old URL has a good position in the SERPs, is that page lost? Meaning we lose the SEO and rankings?
How do we keep the same SERP position for the page?
Does it need a redirect or a rewrite?
If it's a redirect, do we need two pages: page A with the old link and page B with the new link? I don't think so?
If so, does the sitemap need to contain both URLs?
What is the best practice for dynamic URL changes?


 query : Re: Does a "Disallow:" rule with nothing following it in robots.txt block my entire domain? If my website's robots.txt has the following code, does that mean my entire site is being blocked from

@Hamaas447

The rule as you have it up there allows all user agents to access everything. If you wanted to block all user agents from accessing anything, you would add the forward slash:

User-Agent: *
Disallow: /

The forward slash stands for your root directory. This command disallows all user agents from accessing your root directory or anything that it contains - which is everything.
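For contrast, the rule from the question (a bare Disallow with nothing after it) looks like this, and blocks nothing:

```
User-Agent: *
Disallow:
```

An empty Disallow value disallows no URLs, so the whole site stays crawlable.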


 query : What is `/&wd=test` URL that is being requested from my site, probably by bots I'm seeing error logs on a website because something tried to access: example.com/&wd=test the HTTP_REFERER

@Hamaas447

Posted in: #AspNet #WebCrawlers

I'm seeing error logs on a website because something tried to access:

example.com/&wd=test

the HTTP_REFERER is www.baidu.com/s?wd=FQQ
the error is from ASP.net:


A potentially dangerous Request.Path value was detected from the client (&)


The & char is not allowed in that position, it is allowed only after the ? char.

I'm wondering why these hits are happening. Is this a Baidu feature, or is it bad bots?



UPDATE:
I checked some of the IPs using www.abuseipdb.com and I see other websites reporting these IPs for web attacks.

Here are examples:
www.abuseipdb.com/check/111.206.36.143
https://www.abuseipdb.com/check/111.206.36.13

Most of the reports are about &wd=test, but there is some other stuff too.


 query : Re: Will a permanent redirecting domain appear in Google search results? I have a domain A which will permanently redirect to domain B. May I know if there is a chance domain A will appear from

@Hamaas447

Google will still show the old domain (domain A) until it refreshes its index. Your rankings may dance around for a while, but as long as a proper 301 redirect is applied, they should get back on track. Also, submit the new redirected URL (domain B) to Google's search index for faster indexation (Fetch as Google in Google Search Console).
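A minimal .htaccess sketch of such a domain-wide 301 (hypothetical domain names; assumes Apache with mod_rewrite enabled):

```apache
RewriteEngine On
# Send every request on domain A to the same path on domain B, permanently
RewriteCond %{HTTP_HOST} ^(www\.)?domain-a\.example$ [NC]
RewriteRule ^(.*)$ https://domain-b.example/$1 [R=301,L]
```

Preserving the path in the redirect keeps each old URL pointing at its equivalent page instead of funnelling everything to the home page.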


 query : Linking Subdomain to Shared Windows Business Hosting I have my main domain on GoDaddy. And have purchased a Business Shared Windows Hosting from BigRock as GoDaddy has only 200mb limit on MS

@Hamaas447

Posted in: #AspNet #Godaddy #SharedHosting #SqlServer

I have my main domain on GoDaddy.
And have purchased a Business Shared Windows Hosting from BigRock as GoDaddy has only 200mb limit on MS SQL DB.

How can I link a subdomain I create on my main domain to the Business Shared Windows Hosting on BigRock, so that all my ASP.NET source code and MS SQL database are on BigRock (the Business Shared Windows Hosting), while the site is accessible via the subdomain (GoDaddy)?


 query : Re: Which is better for SEO, redirecting to specific page or home page? I have two websites: www.mainwebsite.com and www.otherwebsite.com, but the latter will no longer be used so I'd like to redirect

@Hamaas447

Search engines like contextually sound redirects. They have indexed your old site and know the content. If you redirect it to the home page, which may not have contextually similar content, you may actually get less of a boost, in the end, than if you redirect to a fitting page on your new site. And of course, it's also better for UX, no question.

What it comes down to is, are you going to bulk redirect your whole old domain to the main one, or are you going to take it page by page, or category by category? If you bulk redirect, I recommend redirecting to the page that makes sense, not the home page. However, if you're willing to do more work and redirect pages and categories individually, you can do your SEO some good by spreading it around. So you'd still redirect most of your pages to the specific page on the new site, but you might redirect some of your old pages to the home page, contact page, maybe other pages too. That would be better for your SEO, and hopefully your users too.
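As a sketch of that page-by-page approach in .htaccess terms (hypothetical paths and domain; assumes Apache with mod_rewrite on the old site):

```apache
RewriteEngine On
# Specific pages and categories first, each to its best-matching target...
RewriteRule ^about-us/?$   https://www.mainwebsite.com/company [R=301,L]
RewriteRule ^widgets/(.*)$ https://www.mainwebsite.com/products/$1 [R=301,L]
# ...then a fallback for anything left over
RewriteRule ^(.*)$         https://www.mainwebsite.com/ [R=301,L]
```

Rules are evaluated top to bottom, so the catch-all must come last.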


 query : Re: Using Product Structured Data when not selling online I'm currently working on a site that sells expensive but very niche products that have to be specified to a certain specification before

@Hamaas447

Structured data is used to tell search engines the info about your website, pages, events, products, etc. that they would otherwise have to infer, or may not find at all. That markup, when implemented correctly, frequently (but not always) winds up as rich results for your site when it appears in SERPs. This markup also becomes more important the bigger the Internet of Things gets - after all, structured data is basically metadata.

If you don't care about rich results on your products in SERPs, you don't really need the structured data on those products. However, you might still want to include structured data on your website, for marketing and branding purposes: your "corporate" info, contact info, social links, locations served, etc.


 query : Re: Quickest way to let Google know a site no longer has SSL? I've removed SSL from a website, permanently. I would like Google (and other search engines) to pick this up ASAP as of course

@Hamaas447

While there isn't a way to tell search engines that, specifically, your security protocol has changed, and I agree with @Stephen that you should have left the secure version up, here's a general checklist that would also apply to other scenarios of website/URL changes and canonicalization.


Do resubmit your XML sitemap to Google and Bing, but make sure it contains only the updated URLs.
Make sure your old URLs 301 redirect to your new URLs.
Make sure the canonical tags on every page point to the new version.
Update your Google Search Console and Bing Webmaster Tools to ensure you're tracking both versions, just in case.
Update your analytics too.
Go through the website and make sure all the internal links pointing to various pages are updated with the new URLs.
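For the HTTPS-to-HTTP case specifically, the 301 redirect in the checklist could look like this hypothetical .htaccess fragment (assumes Apache with mod_rewrite):

```apache
RewriteEngine On
# Redirect any HTTPS request to the plain-HTTP version of the same URL
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

Note that browsers only follow this redirect without a warning while a valid certificate is still installed, which is one more argument for keeping the secure version up.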


 query : Is it better for SEO to have multiple subpages or all pages summed up? I am doing a relaunch of my site. example.de/service has 5 subpages. In my new layout I would love to put all subpages

@Hamaas447

Posted in: #Redirects #Seo

I am doing a relaunch of my site. example.de/service has 5 subpages. In my new layout I would love to merge all the subpages into the parent page (/service). But Google has already indexed all the subpages.

What should I do to avoid losing this Google "power"?

Should I leave it as it is and live with the subpages, or can I redirect the subpages to the /service page without losing standing with Google? And if I am "losing", how bad is it?


 query : Re: Allow a folder and disallow all sub folders in robots.txt I would like to allow folder /news/ and disallow all the sub folders under /news/ e.g. /news/abc/, /news/123/. How can I do that please?

@Hamaas447

User-agent: *
Allow: /news/$
Disallow: /news/


Explanation:

Google's robots.txt spec (https://developers.google.com/search/reference/robots_txt), which is more up to date than the "official" spec, states that:

/fish/ will match anything in the /fish/ folder but will not match /fish (no wildcard necessary, since "The trailing slash means this matches anything in this folder."). If you reverse engineer that:

User-agent: * (or whatever user agent you want to talk to)
Allow: /news/$ (allows /news/ but the $ character says the allow can't go beyond /news/)
Disallow: /news/ (disallows anything in the /news/ folder)

Test it in Google Search Console, or in Yandex (https://webmaster.yandex.com/tools/robotstxt/) to ensure it works for your site.
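Spelled out, the longest-match precedence that Google documents should resolve the example URLs from the question like this (worth confirming in the testers above):

```
/news/       -> allowed (matches Allow: /news/$, the longer rule)
/news/abc/   -> blocked (matches only Disallow: /news/)
/news/123/   -> blocked
```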


 query : Why is the sitemap validator returning an error for the "lastmod" tag? The sitemap of my company's brand's website seems to be fine when I look at it at first glance. I don't particularly

@Hamaas447

Posted in: #DateFormat #Sitemap #Xml #XmlSitemap

The sitemap of my company's brand's website seems to be fine when I look at it at first glance. I don't particularly see any errors as such. However, SEMrush, the SEO tool, reports that our sitemap is not in the correct format. SEMrush doesn't tell me what's wrong. It just says that the format is incorrect.

So, I submitted the sitemap to this online validator:
www.xml-sitemaps.com/index.php?op=validate-xml-sitemap&go=1&sitemapurl=http%3A%2F%2Fwww.photojaanic.com%2Fsitemap.xml&submit=Validate+Sitemap
It says no issues found. But, when I run it through another tool:
tools.seochat.com/tools/site-validator/
It returns several errors for the lastmod tags. Here's one of them:




<url><loc>http://www.photojaanic.com/photo-gifts/keychains</loc>
<lastmod>2017-07-24T04:37Z</lastmod><changefreq>yearly</changefreq></url>


Error 1826: Element '{http://www.sitemaps.org/schemas/sitemap/0.9}lastmod':
'2017-07-24T04:37Z' is not a valid value of the union type
'{http://www.sitemaps.org/schemas/sitemap/0.9}tLastmod'. on line: 5 column: 0


Could this be the reason why the sitemap is wrong? I wonder why the date-time format is wrong.

I also tried another validator. Even that one returns many errors along with the lastmod issue:
freetools.webmasterworld.com/tools/site-validator
I'm not so concerned about the other errors, as they're related to video or images; those can be rectified. What I'm concerned about is the lastmod tag.

A way to get around the lastmod issue could be to simply exclude the time. However, I'd like to know why there's an error.
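The error is consistent with the sitemap schema's lastmod type, which accepts either a plain date or a full XML dateTime, and a dateTime must include seconds. Either of these variants (values taken from the failing entry) should validate:

```xml
<url>
  <loc>http://www.photojaanic.com/photo-gifts/keychains</loc>
  <!-- date only -->
  <lastmod>2017-07-24</lastmod>
  <!-- or, keeping the time, with seconds included: -->
  <!-- <lastmod>2017-07-24T04:37:00Z</lastmod> -->
  <changefreq>yearly</changefreq>
</url>
```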


 query : Re: How can I speed up my wordpress website? My website is based on a Wordpress and I want to know why its loading metrics are so bad. What can I do practically? I've already tried to use:

@Hamaas447

The most practical thing you can do is:


Optimize your images
Remove query strings from static resources (there's a plugin for this)
Set expire dates on your headers (I think you can do this with Cloudflare)
Choose your hosting provider wisely.


If you're hosting on a shared server somewhere like GoDaddy or Bluehost, then there's not much you can do. I would suggest looking into a managed VPS host like Cloudways.
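For the "expire dates on your headers" point, a common .htaccess sketch (assumes Apache with mod_expires enabled; the types and lifetimes here are illustrative):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Cache images for a month, CSS and JS for a week
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```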


 query : Tell search engines that current text is temporary I have summaries of my latest articles on the website's first page and category pages. As you know these summaries are temporary and they're gone when

@Hamaas447

Posted in: #ChangeFrequency #Seo

I have summaries of my latest articles on the website's first page and on category pages. As you know, these summaries are temporary: they're gone when new articles are published.

The problem is that when I search the title of one of those articles on Google, Google shows the first page or category pages in the results besides the article page.

How can I tell Google that this text is just a summary and will be gone after a while?


 query : Re: How to get my site spammed? I know this might be an unusual question as most people will usually want the opposite as what I’m asking for, but I want to run an experiment and I would

@Hamaas447

I wouldn't use WordPress for a honeypot, which is what I think you want to do. I'd just build a quick informational site with a commenting system that doesn't do email verification or captcha. Also put an open mailto link on it for people to use if they have problems posting. Post a link to the domain wherever you can, as well as a link to the email address. Then disable spam filtering, and you should be swimming in spam in a few days.


 query : Crawl and structured data errors on non-existent webpages I have errors reported in two sections of my google search console: 1) Structured data It indicates that the following pages were missing

@Hamaas447

Posted in: #GoogleSearchConsole

I have errors reported in two sections of my google search console:

1) Structured data
It indicates that the following pages were missing several pieces of information, such as entry-title, updated, and author:
/?p=6
/?p=1
/?m=201710

2) Crawl error
It indicates that "Googlebot couldn't access the contents of this URL because the server had an internal error when trying to process the request. These errors tend to be with the server itself, not with the request." for the following link:
/?cat=1

This results in a response code of 503.

What I am confused about is that the above links do not even appear in my sitemap.xml file, so I can't understand how to fix this.

Does it have to do with Google not having the most up-to-date sitemap.xml file for my website? If so, how should I get them to retrieve the latest version?


 query : Re: Is it illegal to create a website that allow a user to upload a file and make it accessible and downloadable for others? I am working on a website in which a user can sign in and upload

@Hamaas447

Building a website where people can share files is not illegal.

If people upload content where the copyright is owned by someone else and they don't have permission to share it, then that is illegal and they are responsible.

If you then share that content, you become responsible.


 query : Re: Does Angular browser language detection and redirection affect SEO? I'm building an Angular app that supports English and Japanese. The main site: domain.com, is displaying in Japanese. The other

@Hamaas447

The following might be factors in why Google showed you the English version of the website:


Browser language - if you have set your browser's default language to EN, Google will most likely show you the English version of the site.
Location of your IP - despite the fact that you use Google.jp, if Google detects that your IP is not in Japan, you will still be served the English version.
Cached version of the site - Google sets cookies on every search (especially when logged in, even in incognito mode).


So, why not try the following:


Clear your browsing history; use a premium VPN (if you are not physically in Japan); or
ask a friend/colleague in Japan to do the manual search;
try a different browser (with the default language set to JP).


 query : After a domain name expired and goes into a grace period, what happens to the WHOIS information on it? I have a domain name that is expiring in a few weeks. I want to delete the domain

@Hamaas447

Posted in: #DomainRegistrar #Domains #Whois

I have a domain name that is expiring in a few weeks. I want to delete the domain name right now, but the Registrar company will only let me delete it by letting it expire. I have purchased a WHOIS protection that hides my contact information on the domain name.

If I let my domain name (along with the WHOIS protection) expire and it goes into a grace period, will my WHOIS contact information still be on it?


 query : Re: Purchase domain and web-hosting from different company I know that it is possible to purchase domain from one company and web-hosting from other, but I have never done that yet. Its first time

@Hamaas447

But how does company B know that I am the real owner of this domain?


Company B, the hosting company, is not tied to domain names at all; company A holds the registration and the DNS settings.


What if someone has already purchased web hosting for this domain at company B?


A domain can only be registered to one company/individual; when you purchase that domain, it is yours for the remainder of the registration term.


Does that mean I would not be able to purchase web hosting from
this company for my domain? Or how does it work?


You can purchase hosting from any company that you wish; you can even host your site from your own computer if you have a static IP address.

Domains

When you purchase a domain, it gets registered with that company (the domain registrar), and that company sends all the domains it has registered to the domain name registries.

DNS settings

Keeping it simple: DNS settings route visitors from your domain name to your server.

Hosting

Keeping it simple: hosting companies provide the servers that host your site. So go to your domain registrar and change the DNS settings to point to your hosting server at the other company.
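As a sketch, the records you would set in the registrar's DNS panel might look like this zone-file fragment (all names and addresses are placeholders; your hosting company tells you the real server IP):

```
example.com.      3600  IN  A      203.0.113.10    ; your hosting server at the other company
www.example.com.  3600  IN  CNAME  example.com.    ; www follows the bare domain
```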


 query : Re: Would title tags that are only slightly different still be considered duplicate titles? I know that it's bad to use duplicate content, including duplicate titles. Does duplicate titles mean that

@Hamaas447

At a quick look, the pages do seem to be duplicates. It's a bit vague what content you are trying to create here. Are you planning to use a static page for each of the numbers, or will the pages be dynamic? I got the impression that the body/context of each page would have a similar format to the title, and that the content will be the same except for the values.


 query : SSL Labs Key exchange < 100%, why? How do I make it? I am in progress of trying to get maximum (or almost maximum) score on SSL Labs my site: I currently don't understand why it does

@Hamaas447

Posted in: #Https #SecurityCertificate

I am trying to get the maximum (or almost maximum) SSL Labs score for my site.

I currently don't understand why it does not give me the full Key exchange score.



This is in spite of the fact that I have generated a 4096-bit DHParameters file and tried many other things, like buying a new certificate with a 4096-bit key.

So, my question for you is:

Why is the SSL Labs Key exchange score less than 100%? And how do I make it full?

OS: GNU/Linux Debian 9.2 with Apache 2.4.


 query : Re: Does using too many heading tags (h1s, h2s, and h3s) cause SEO problems? On my website, I utilize many h1, h2, and h3 tags. I am most worried over the fact that I utilize the h1 tag way

@Hamaas447

From my perspective, I don't think Google imposes strict rules on the number of heading tags to use on a page. There are also websites that use heading tags (h2/h3) for subheadings in longer content; that's good for both users and search bots.


 query : What's the difference between WordPress hosting and other hosting plans? I aim to use WordPress to build an e-commerce website, however it's not clear if the hosting plans offer genuine difference

@Hamaas447

Posted in: #WebHosting #Wordpress

I aim to use WordPress to build an e-commerce website; however, it's not clear whether the hosting plans offer genuine differences or it's just marketing to make a plan seem more advantageous.

For example:

1&1 provide WordPress hosting and a Linux plan.

Are there any advantages in terms of site performance, flexibility, etc between the plans?

