
@Turnbaugh106

Turnbaugh106

Last seen: Tue 11 May, 2021


Recent posts

 query : How Can I Filter Out Sessions to This URI? This is probably a complete newbie question (ok, it is a newbie question). I created a view for my site to just show members who have not registered

@Turnbaugh106

Posted in: #GoogleAnalytics

This is probably a complete newbie question (ok, it is a newbie question).

I created a view for my site to just show members who have not registered yet. This is pretty easy to do as non-members visit the site at * and members visit at app.*

However, I want to filter out sessions from users who visit *.com/login/
I've set a filter to exclude the request URL of /login/, but when I check the real time reports I still see sessions on this URI.


 query : Re: Should I redirect login / admin pages to the home page? I'm on a security kick right now and my current project is a server with a Wordpress site (that used to be a Joomla site). I sometimes

@Turnbaugh106

I wrote an article a couple of years back that's worth a read, especially the section about "Creating a Strong Complex Password":


10 ways to stop Brute Force attacks in WordPress

Many WP plugins will stop brute force hackers in their tracks; however,
because most plugins work by banning IP addresses after X attempts,
they may not stop determined password crackers using multiple IP
blocks in the thousands.

Most plugins lift bans after X minutes, meaning that with enough IP
addresses you become pretty much immune to these bans unless you opt
for an extremely long ban duration. Administrators generally do not
use long-duration bans because they don't want to get locked out
themselves.


A lot of the current answers are way overcomplicated

Normally I would suggest fail2ban or hiding wp-login.php, but 50 attacks is nothing! With a decent password it would take them many decades. Personally I would keep things simple and just install... Loginizer for WordPress.

It is used by millions of sites; once installed, it automatically bans bad users by IP address for a few hours, a day, weeks, months or even years. It will also show you how many attacks it has blocked.
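
For anyone curious how the ban-by-IP mechanic works under the hood, here is a minimal sketch in PHP. It is purely illustrative; the file-based counter, thresholds and variable names are assumptions, not Loginizer's actual code:

<?php
// Illustrative ban-by-IP throttle: count failed logins per IP and
// refuse further attempts once a threshold is reached.
$ip       = $_SERVER['REMOTE_ADDR'];
$maxTries = 5;                    // attempts allowed before a ban
$banTime  = 3600;                 // ban duration in seconds
$store    = sys_get_temp_dir() . '/login_attempts_' . md5($ip);

$attempts = is_file($store) ? (int) file_get_contents($store) : 0;

// Still banned? Refuse before even checking credentials.
if ($attempts >= $maxTries && (time() - filemtime($store)) < $banTime) {
    http_response_code(403);
    exit('Too many failed logins. Try again later.');
}

$loginFailed = true; // placeholder for the real credential check

if ($loginFailed) {
    file_put_contents($store, $attempts + 1); // bump the counter
} elseif (is_file($store)) {
    unlink($store);                           // success clears the counter
}

As the quoted article points out, an attacker rotating through thousands of IP addresses sidesteps this kind of throttle entirely, which is why a strong password remains the real defence.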


 query : Re: Using rel="nofollow" on a form that uses/links to Google Search page? I have a form that uses/links to Google's Search page to search for the input within my site. I have read other posts

@Turnbaugh106

The attribute nofollow is for:


Paid Links
Untrustworthy Links


The attribute nofollow is not for:


Shaping internal page flow
Shaping external page flow
Trusted relevant external sites


In short... if the link is to a website that is trustworthy, then there is no need to use nofollow. It's also worth mentioning that Google and Bing are not going to submit your form in any case.
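
For illustration, this is all the attribute amounts to in markup (the URL is hypothetical):

<!-- A paid or untrusted link: ask search engines not to pass ranking credit -->
<a href="https://example.com/sponsor" rel="nofollow">Our sponsor</a>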


 query : Re: HTTP-header canonical and link tag canonical on same page? I've come across websites using both a HTTP-header based canonical as well as a tag-based canonical. Sometimes they are not referencing

@Turnbaugh106

Googlers mention this issue in one of their blog posts:


Specify no more than one rel=canonical for a page. When more than one
is specified, all rel=canonicals will be ignored.


5 common mistakes with rel=canonical

Google's algorithm can ignore both of them if they point to different URLs.
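
In practice the two mechanisms look like this; the file name and URLs below are hypothetical, and if both are present they must name the same URL:

# The tag-based canonical goes in the HTML <head>:
#   <link rel="canonical" href="https://www.example.com/page/">
#
# The HTTP-header equivalent (useful for non-HTML files such as PDFs)
# can be sent with mod_headers from .htaccess:
<Files "whitepaper.pdf">
Header add Link "<https://www.example.com/whitepaper.pdf>; rel=\"canonical\""
</Files>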


 query : Re: My blog post lost all rankings after posting the same content on Medium and don't have it back after removing the Medium post I copied my article to a medium post, I lost all my ranking.

@Turnbaugh106

Wait, wait and wait some more

Simple: you wait until Google drops the page from its index, which can take several days or even weeks.

But...

It's unlikely that Medium had any influence on your rankings drop, since Medium's publication and cross-posting pathways automatically add canonical links to protect your original content posted offsite, as the source quoted below explains.


SOURCE

Search engines use canonical links to determine and prioritize the
ultimate source of content, removing confusion when there are multiple
copies of the same document in different locations. Sites that publish
an over abundance of duplicate content without indicating a canonical
link may be penalized in search engine rankings.

Medium’s publication and cross-posting pathways automatically add
canonical links to protect your original content posted offsite. This
means that Medium can only boost — not cannibalize — your SEO.


Canonicals work... most likely something else

If your rankings have dropped, I'm willing to bet it has nothing to do with Medium and you're pointing fingers in the wrong direction. But to answer your question: Google can take several weeks to fully drop articles because, unless you serve a 410 (Gone), it has to make sure the 404 is not temporary.
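
If you do want the copy dropped faster, a 410 can be served from .htaccess; a minimal sketch with a hypothetical path:

# Tell crawlers the removed article is gone for good (410, not 404),
# so search engines de-index it sooner.
Redirect gone /old-article/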

Finally, regarding the N/A you mentioned, just so you know:


SOURCE

N/A is a common abbreviation in tables and lists for the phrase not applicable, not available, or no answer. It is used to indicate when information in a certain table cell is not provided, either because it does not apply to a particular case in question or because the answer is not available.


My point is... N/A does not mean zero.


 query : Re: Why would you use document.location.protocol instead of plain // prefixed urls? For instance Google Analytics uses document.location.protocol in the boilerplate for tracking: <script type="text/javascript">

@Turnbaugh106

Indeed, it was not an oversight by the GA team!
The GA loader loads a script, so it is not affected by the double-download bug that protocol-relative URLs trigger on a <link> or @import for stylesheets in IE7/IE8.
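
For reference, this is the classic asynchronous ga.js loader in question; the ternary picks a full protocol and subdomain instead of a protocol-relative URL:

// Classic asynchronous Google Analytics loader (ga.js era).
// The ternary selects "https://ssl." or "http://www." explicitly,
// sidestepping the IE6 dialog described below.
(function() {
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();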

They use the conditional (ternary) operator on document.location.protocol because of an edge-case bug in IE6 that causes a security dialog to pop up under certain security settings when requesting from the non-'ssl' subdomain,
as explained by Paul Irish (who worked with the Google Analytics JavaScript lead developer on this matter) on his blog: www.paulirish.com/2010/the-protocol-relative-url/ from which I quote below:


2011.01.23: But... what about using this on the Google Analytics snippet?

Yes of course, wouldn't that be nice... So I worked with the Google Analytics javascript lead developer (God, I love working at google) to see if we could do this... turns out we can't. There is an edgecase bug in IE6 that causes a dialog to blow up... under some security settings (unsure if they are default) when requesting from the non-'ssl' subdomain. screenshot here. So feel free to take 40 bytes off your GA snippet if you don't care about IE6.. otherwise you're gonna need that ternary operator. `:)`

2011.12.24: Eric Law (from the IE team) chimes in on why IE6 doesn't play well with GA...

The reason this doesn't work in IE6 is that the server is using SNI to deduce what certificate to return. XP (and thus IE6) doesn't support SNI in the HTTPS stack. See for details.


 query : Re: Why does google only index a part of my site? For a while now I have been having issues with google indexing my site. For some reason it wouldn't take over 33 pages from my xml sitemap:

@Turnbaugh106

I appreciate this is old, but there is a consideration that is easily overlooked.

If you are dynamically creating your sitemap (in PHP, for example), a site can get stuck on one URL indexed from the sitemap if you don't format the URLs correctly.

echo'<url><loc>'.$SITE.'/asection/acategory/somepage.html</loc><lastmod>2018-01-14</lastmod><priority>0.7</priority></url>';


Simple enough, right?

Now let's say our variable is:

$SITE = 'https://www.icalculator.info/';


All good, right? Wrong. When the PHP is parsed, the result is a URL like this:
www.icalculator.info//asection/acategory/somepage.html

Note the double // after the domain. This is one of those very silly but incredibly easy mistakes to make. I do occasional SEO reviews for friends and the odd contract, and I've lost count of the times I've seen this mistake. Anyway, it's shared now, so if you have the dreaded one page from your sitemap showing in Google Webmaster Tools, you can check this.
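
A defensive fix is to normalise the base URL before building the <loc> entries, so a trailing slash in $SITE can never produce the double slash:

<?php
// Strip any trailing slash from the base URL before concatenating paths.
$SITE = rtrim('https://www.icalculator.info/', '/');

echo '<url><loc>' . $SITE . '/asection/acategory/somepage.html</loc>'
   . '<lastmod>2018-01-14</lastmod><priority>0.7</priority></url>';
// Result: https://www.icalculator.info/asection/acategory/somepage.html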


 query : Re: Using an old domain with new content reflects SEO? I have searched for my question on google but i didn't found an answer, i have a deals website (e-commerce) but actually it's not working,

@Turnbaugh106

The main issue you will encounter is that the existing backlinks can become irrelevant, which in time can be detrimental to your rankings.

Nowadays it is better to seek out an old domain that has on-topic links, or one that has few links to none.

Relevancy is everything in this day and age.


 query : Re: Google indexing my site with https So I Want to Redirect it over https but Giving 404 I don't know why Google is indexing my wordpress website with https even I have not installed any SSL

@Turnbaugh106

The reason that Google is indexing the HTTPS address is that within your WordPress general settings you have the WordPress Address, the Site Address, or both set to HTTPS. Since your site uses Yoast SEO, which handles canonical links, the plugin is adding <link rel="canonical" href="https://www.tipstricksisland.com/" /> to your pages, which tells Google, Bing and other search engines which version they should index.

To correct the issue, simply log in to the WordPress dashboard, click Settings > General, and change both the WordPress Address and the Site Address to your URL without the HTTPS.



Then simplify your .htaccess using something like this:

# REDIRECT SSL TO NON-SSL
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

# ADD WWW TO NON-WWW REQUESTS
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress


Once you have done the above, simply view the source of your pages to confirm that the canonical links no longer use HTTPS. Then it's a matter of waiting for Google to change the address; this process can take several weeks.


 query : How/Why does Google return a result when Gibberish input is given using another language's keyboard? I'm studying Russian. As such, my laptop (Lenovo T460 thinkpad if it matters) is set up to

@Turnbaugh106

Posted in: #GoogleSearch #Language

I'm studying Russian. As such, my laptop (Lenovo T460 thinkpad if it matters) is set up to switch between English and Russian when I hit alt shift.

Well, being the human I am, I occasionally don't think to switch back to English. So, in the searchbar, I'll hit the keys for "Y, O, U, T, U, B, E". Only, my Cyrillic keyboard is still enabled. So, instead of English, the real search is "нщгегиу". (н shares a key with y, щ shares a key with o, and so on.)

Instead of returning gibberish like you'd think, Google (Chrome) still returns YouTube.

This isn't limited to YouTube; I've found it working for all of my searches. Does Google search using your default computer language? Does it do some weird fuzzy logic to figure out what's more likely? It seems silly to set up dozens of hidden Markov models to check the language of each and every search...


 query : Re: Do cryptocurrency webminers in a blog violate adsense policies? JavaScript based webminers are used to mine crypto currencies by using visitors' computing power on several sites. Is that illegal?

@Turnbaugh106

Google's AdSense policy does not touch upon cryptocurrency browser miners directly, but it is clear that malware, or anything else that interferes with site navigation, is considered a breach of their terms of use; therefore webminer scripts are likely to breach that policy.


Site behaviour

Sites showing Google ads should be easy for users to navigate. Sites
may not change user preferences, redirect users to unwanted websites,
initiate downloads, include malware or contain pop-ups or pop-unders
that interfere with site navigation.


Crypto Currency Browser miners are not Malware nor do they interfere with navigation...

At this point I would expect you to sit up straight in your chair and want to throw out justifications that it's neither malware nor will it interfere with site navigation, but hear me out first!


Site Navigation


While it's true that you could set the CPU usage of the web miner low enough that users won't notice a performance drop, it should be dearly noted that a computer that is already slow or under load becomes even more stressed, so site navigation could still be affected by the performance of the machine. Google could use this as justification to terminate or suspend your account.

Malware


AVAST! has already classified the Coinhive webminer as malware and has blocked the script from running; a discussion about this can be found on Reddit. More will likely follow suit.
Malwarebytes has been blocking the original Coinhive API and related proxies an average of 8 million times per day, which added up to approximately 248 million blocks in a single month.
Google Chrome: the team are already thinking of adding a block feature to the browser; you can find out more about this on bleepingcomputer.



Not Eco-friendly

While hardware manufacturers are constantly looking at ways of reducing energy consumption, web miners unnecessarily increase the power draw of the CPU, which has a direct impact on the user's power utility bill. In fairness, this will likely amount to a few pence, depending on the settings of the web miner.

Stigma...

The problem you face is the stigma around web miners: since they can be set to aggressive levels, they are unlikely ever to be accepted. The other issue is that Google and Bing may see web miners as direct competition for their ad networks, because if websites can make money from crypto, sites won't need ads.

Google and Bing alone can bar websites from their ad networks or, worse, purposely apply an SEO penalty to sites running miners. Ultimately the future of browser web mining is in the hands of Google, Bing and antivirus software vendors.

My Opinion?

Not that it counts... any site I visit that runs a crypto miner on my machine joins my ban list, and hundreds of thousands of IT-minded people are developing hardware and software solutions to block these scripts.

Most browser families already have dozens of add-ons that block them. Sorry, but there's a BIG difference between clicking an optional ad and running a non-optional crypto miner.

I would, however, change my stance on this idea if webmasters offered an 'opt in' option.


 query : Re: Does a broken link within conditional comments have any effect on SEO? I have many sites with this old code which is intended to direct IE6 users to get a newer browser or to get Google

@Turnbaugh106

IE7... was released in 2006

Google does not care about outdated browsers; it focuses on the majority, and the majority doesn't use a browser released 11 years ago.

IE7 is older than most cars on the road in the UK

If you want to cater for users with browsers older than most cars on the road in the UK, then you should change the HTML so it includes a new link, or no link at all with just the message (a sketch follows below). It should be noted that most IE7 users will encounter millions of websites that do not render, and most of those sites will not tell the user to upgrade their browser.
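
A minimal sketch of the message-only variant, using the conditional comments IE itself understands (the wording is an assumption):

<!--[if lte IE 7]>
<p>Your browser is over a decade old and no longer safe. Please upgrade to a current browser to view this site properly.</p>
<![endif]-->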

IE7 has 263 vulnerabilities

It's extremely unlikely that 'real' users will visit your site using IE7; it has over 263 known vulnerabilities, and personally I wouldn't bloat my code for those few users.

Traffic

With all this said, it boils down to how many users visit the site. If you receive millions of unique visitors a month then the 'amount' of people using IE7 increases, and it may be worthwhile catering for them. But for the majority of sites, with hundreds to a few thousand visitors a month, it'll be extremely rare to see a spike of IE7 user agents lurking in the logs, and those that do appear could well be 'naughty' bots.


 query : Clicking on a link in Google Analytics shows me data for the link without the / at the end, which is incorrect Whenever I click on a link in G Analytics, it automatically redirects to show

@Turnbaugh106

Posted in: #GoogleAnalytics #TrailingSlash #Url

Whenever I click on a link in Google Analytics, it automatically redirects to show me the stats for the link without the obligatory / at the end.

Which means it shows me almost nothing, since I have WordPress and all my links have / at the end.

Has anyone else experienced this? Is there a way to solve the problem?

If I click back, it doesn't work because it redirects again to the silly reports.

If I click back more, I get an error.

Here's an example:


An illegal state has occurred.

The request resulted in an inappropriate state. Reload the page to see whether the problem persists.

Error ID: c67296d9-bb43-4417-9230-2f07f25ca50f


 query : Re: Why Google is indexing Thumbnails and not Large images? I have a page full of thumbnails, when clicked it opens the Full size image in new tab. The problem is that Google is indexing the

@Turnbaugh106

The issue is that Google only sees one alt attribute, so the other image has no description and will not rank in image search results.

If you're using a lightbox, then preferably code or use one that supports HTML5 data-* attributes (thumbnail in src, full-size image in data-src), e.g.:

<div class="thumb">
<img src="example-thumbnail.jpg" data-src="example.jpg" alt="example">
</div>


If you are using only CSS and HTML, then you COULD use one of three methods that I can think of:



Method 1: Scrap the thumbnails and resize the larger image down to thumb size:

<!-- HTML -->
<div class="thumbnail">
<img src="example.jpg" alt="example">
</div>

/* CSS */
.thumbnail img {
max-width: 200px;
}
.thumbnail img:hover {
max-width: 100%;
}


You could even spice this up by using a Pure CSS lightbox



Method 2: Link to a valid page rather than an image:

<!-- Embedded Small -->
<a href="/path/to/example.html" title="View Image Full Size">
<img src="example.jpg" alt="thumbnail of example">
</a>




Method 3: Show and Hide:

<!-- Both Images -->
<div class="thumbnail">
<img src="example.jpg" alt="example">
<img src="example-thumbnail.jpg" alt="example thumbnail">
</div>

/* CSS */
.thumbnail img:first-child {
max-width: 200px;
}
.thumbnail img:last-child {
max-width: 100%;
display: none;
}
.thumbnail:hover img:first-child {
display: none;
}
.thumbnail:hover img:last-child {
display: block;
}


 query : Re: Do descriptive over basic anchor links have any bearing on SEO? I have a page containing several different products, one after the other, each one on a separate line. Each product has a unique

@Turnbaugh106

When was the last time you noticed a site being indexed with a 'HASH' in the URL? The answer is never. Google treats hashes within a URL as hash fragments and does not use them to rank a site.

To get around this issue your pages should be accessible without the hash fragments; a quick example looks like this:

example.com/location#bournemouth <--- users
example.com/location?bournemouth <--- bots
example.com/website#seo <--- users
example.com/website?seo <--- bots
example.com/website#design <--- users
example.com/website?design <--- bots


You could also opt to use a HASHBANG, e.g.:

www.example.com/#!design/ <--- users
www.example.com/?_escaped_fragment_=design/ <--- bots


The above code is just an example to give you an idea and you can think up your own structure.

Also, SEF (Search Engine Friendly) URLs aren't really that friendly: Google and Bing give them little weight in the actual algorithm, since they are a 'content signal' that, on a good page, can be found elsewhere. They used to carry a lot more weight, but since most sites now use them, they're hardly a factor. URLs should be made for your users, not search engines.


 query : Re: How do I configure gzip to work on External Resources In order to do this I put code to .htaccess: <ifModule mod_gzip.c> mod_gzip_on Yes mod_gzip_dechunk Yes mod_gzip_item_include file .(html?|txt|css|js|php|pl)$

@Turnbaugh106

You have no control over expires, compression or any other header property on external resources.
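
For resources you do serve yourself, compression is normally enabled with mod_deflate (the modern replacement for the mod_gzip module in the question); a minimal .htaccess sketch:

<IfModule mod_deflate.c>
# Compress the usual text-based types served from this host.
AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript application/json
</IfModule>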

Rather than trying to satisfy pointless speed-score websites, you should focus on how quickly your pages load. Google, Bing and your users DO NOT CARE what your code looks like or how your website loads it; what they care about is the end result.

If your website has an F score and loads in 1 second, but another site has a B score and loads in 1.5 seconds, which do you think search engines and users will prefer? Around 95% of the speed of a website is down to the hosting, unless you have serious configuration errors or unconventional use.

I recommend you spend more time on metrics that actually matter: the outcome. Using WebPageTest you can target your audience's region and do multiple passes; I recommend 10 passes from multiple locations at the same time.


 query : Can reducing page size impact seo? We have e-commerce website where we list all the products on our infinite scroll search page. The user can filter/sort these products. Currently, we are fetching

@Turnbaugh106

Posted in: #InfiniteScroll #Seo

We have an e-commerce website where we list all the products on our infinite-scroll search page. The user can filter/sort these products.

Currently we fetch 20 products on mobile and 25 on desktop on each page request, and fetch more as the user scrolls down. We want to optimize performance, and we thought of reducing the page size on page load, i.e. load only 5 products initially; if the user scrolls down, fetch 10 more, and if they scroll to the bottom, fetch another 20, and so on.

But I am worried that Googlebot will see only 5 products instead of 20. We have put rel=next and rel=prev to indicate the linkage between these pages.

What would be the best approach to solve this:


Reduce page size and fetch more products only if the user scrolls down.
Reduce page size and lazy-load more products after a few seconds.

Would either approach help SEO rankings?


 query : Is there any threshold page load time to get spike in the conversion? I was reading an article here where I found very interesting correlation of conversion rate and page load time. I can't

@Turnbaugh106

Posted in: #Conversions #Optimization #PageSpeed #Performance

I was reading an article here where I found a very interesting correlation between conversion rate and page load time. There are two things I can't work out:


Why is the graph almost flat from 5 seconds to 11 seconds of page load time, with a sudden spike when moving from 4 seconds down to 1 second? Is a <2 second page load the real threshold?
Why does the conversion rate drop between 11 and 13 seconds of load time, while it goes up from 13 to 15 seconds? Is the data wrong?


 query : Re: how to deal with URLs containing underscores I know that hyphens are preferred to underscores, and I could do a 301 redirect. But some say it's not worth the chaos to change them and I pretty

@Turnbaugh106

It's not a major problem if you have some underscores; Google and Bing have come a long way since they first launched. If you want to be politically correct, then you can rewrite underscores to hyphens by editing the .htaccess with something like this:

Options +FollowSymLinks
RewriteEngine On
RewriteBase /

RewriteRule !\.(html|php)$ - [S=6]
RewriteRule ^([^_]*)_([^_]*)_([^_]*)_([^_]*)_([^_]*)_([^_]*)_(.*)$ $1-$2-$3-$4-$5-$6-$7 [E=underscores:Yes]
RewriteRule ^([^_]*)_([^_]*)_([^_]*)_([^_]*)_([^_]*)_(.*)$ $1-$2-$3-$4-$5-$6 [E=underscores:Yes]
RewriteRule ^([^_]*)_([^_]*)_([^_]*)_([^_]*)_(.*)$ $1-$2-$3-$4-$5 [E=underscores:Yes]
RewriteRule ^([^_]*)_([^_]*)_([^_]*)_(.*)$ $1-$2-$3-$4 [E=underscores:Yes]
RewriteRule ^([^_]*)_([^_]*)_(.*)$ $1-$2-$3 [E=underscores:Yes]
RewriteRule ^([^_]*)_(.*)$ $1-$2 [E=underscores:Yes]

RewriteCond %{ENV:underscores} ^Yes$
RewriteRule (.*) http://www.example.com/$1 [R=301,L]


If you would rather use PHP, then you can rewrite the URLs using a replace, e.g. $input_uri = $_GET['rewrite_uri']; followed by $output_uri = str_replace("_", "-", $input_uri); and a 301 redirect, as sketched below.
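
A minimal sketch of that PHP approach; it assumes your rewrite rule passes the requested path as the rewrite_uri query parameter:

<?php
// Redirect any URI containing underscores to its hyphenated equivalent (301).
$input_uri  = $_GET['rewrite_uri'] ?? '';
$output_uri = str_replace('_', '-', $input_uri);

if ($output_uri !== $input_uri) {
    header('Location: http://www.example.com/' . ltrim($output_uri, '/'), true, 301);
    exit;
}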


 query : Create FB custom audience from user URL I have a list of many Facebook users with URLs like: facebook.com/firstname.surname facebook.com/pseudo facebook.com/profile.php?id=12345678910xxx1112 How can I

@Turnbaugh106

Posted in: #Advertising #Facebook #FacebookApplication #FacebookGraph

I have a list of many Facebook users with URLs like:

facebook.com/firstname.surname
facebook.com/pseudo
facebook.com/profile.php?id=12345678910xxx1112


How can I create a Custom Audience with these people in www.facebook.com/ads/manager/audiences?
Here I don't see a "User URL" field (I tried to upload a CSV with the user URLs and it doesn't work):



If it's definitely not possible, how can I get the other accepted fields from my list of user URLs, so that I can feed the audience creation tool with those fields instead?


 query : Re: Discrepancy between Search Console and Structured Data Testing Tool I have 200+ errors flagging in Search Console > Search Appearance > Structured Data These are all hentry errors that say 'Missing:updated'.

@Turnbaugh106

Yes, you should trust the structured data testing tool from Google over Google Search Console (GSC).

I once had 11m+ pages receive the structured data error in GSC.

It took 5 months for GSC to catch up (after we fixed the error).


 query : Using Google Analytics to track my content in iframe on third-party sites? I have built a small visualisation and I want to allow third-party users to embed it within an iframe on their pages.

@Turnbaugh106

Posted in: #Analytics #Embed #GoogleAnalytics #Iframe

I have built a small visualisation and I want to allow third-party users to embed it within an iframe on their pages.

When they do this, I would like to use Analytics to track (i) how many people have seen the visualisation and (ii) the URLs on which it has been embedded.

The HTML and JS of my visualisation are both on the same domain, but obviously my domain is different from the domains on which it has been embedded.

I have read Google's documentation on iframes but it isn't helping me a great deal!

Two questions:


Can I just use the basic Analytics setup in the normal way in my HTML, or are there any issues I need to be aware of?
How do I pass in the parent URL as a custom dimension? Presumably I can't just get the URL of the parent page, for security reasons, so I'm wondering if I need to do something like this?


 query : Re: Should stop words such as "and" and "in" be omitted from URLs for SEO? I have read before, it's recommended to don't use And, In and The in the URL, because this is will effect on the

@Turnbaugh106

The use of 'stop' words has never hurt rankings unless it was considered excessive word spam... in the late 90s they were treated as noise and to some degree ignored. Thankfully times have changed, and Google treats common words completely differently than it did almost two decades ago. I highly recommend SEO guides written in the past two years.

URLs should be kept short, but not at the expense of users not understanding the content before clicking. For example, if you had a page about the movie 'The Game', it would be better to use /the-game/ rather than /game/.

Let's pretend for a moment that you went out on Saturday night to a bar called Hacker Bar, this should look something like this:


/night-out-at-hacker-bar-september-2017/ GOOD
/hacker-bar-images-september-2017/ GOOD
/hacker-bar-september-2017/ GOOD
/hacker-bar-images/ GOOD
/pictures-of-me-having-a-night-out-at-hacker-bar-back-in-september-2017/ POINTLESS


The point I'm trying to make is that there's no secret formula for a good URL. A good URL tells users what they need to know before clicking; it's also a highly subjective topic.

Consider the URL a short summary; the title and meta description are where you fill in the blanks and give users the full description.


 query : Re: HTTPS. Solution for "This page is trying to load scripts from unauthenticated sources" from third-party sources I've got a website using third-party JavaScript scripts, from Google AdSense (to show

@Turnbaugh106

You can try this:

<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests" />


Or

<meta http-equiv="Content-Security-Policy" content="block-all-mixed-content" />


Paste it in <head>...</head> tags.

The HTTP Content-Security-Policy (CSP) block-all-mixed-content directive prevents loading any assets using HTTP when the page is loaded using HTTPS.

All mixed content resource requests are blocked, including both active and passive mixed content. This also applies to <iframe> documents, ensuring the entire page is mixed content free.

The upgrade-insecure-requests directive is evaluated before block-all-mixed-content; if the former is set, the latter is effectively a no-op. It is recommended to set one directive or the other, not both.


 query : Re: Google adwords keywords combinations (does it make a difference) I am wondering if Google Adwords treats keyword combinations listed below differently? Option A: Option B (combination of the

@Turnbaugh106

I would assume they do, since you are talking about two different target keywords.

Teenager

Psychological Help + Teenager

are two different searches and are likely to have two completely different costs associated with them. Generally, the more specific the keyword, the cheaper the buy, and the greater the likelihood that you are matching what a user is searching for.


 query : Re: Placing LegalService schema on homepage vs. contact page I am marking up schema.org for a law firm site, the site is a basic 7 page site detailing a law firm that has multiple locations. My

@Turnbaugh106

I would make a page for each location and put the markup for each individual location on its own page, so that each location has a canonical page you can use; a sketch follows below.
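
A minimal per-location sketch in JSON-LD; every name, address and URL here is a hypothetical placeholder:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LegalService",
  "name": "Example Law Firm - Springfield Office",
  "url": "https://www.example.com/locations/springfield/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>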


 query : Re: How to prevent Alexa and Google from indexing our stuff? My traffic is not from search engine and I want to prevent bots or others from displaying or accessing it. What would be a good .htaccess

@Turnbaugh106

All of those .htaccess parameters are just requests; Google and the rest can still crawl your site since it is on the internet and publicly available. You should define what you want to allow them to crawl in a robots.txt file.

This might help: yoast.com/x-robots-tag-play/
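
As that article covers, a stronger signal than robots.txt is the X-Robots-Tag response header; a hypothetical .htaccess sketch that keeps file downloads out of the index:

# robots.txt can only ask crawlers politely, e.g.:
#   User-agent: *
#   Disallow: /private/
# The X-Robots-Tag header tells compliant engines not to index at all:
<FilesMatch "\.(pdf|doc|zip)$">
Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>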

