Newbie Question(s) so dumb, you're afraid to even ask!

I installed a History/Activity plugin and saw that a variety of different IP addresses keep trying to log into my WordPress site. I get that bots will do that, but they are attempting to log in with the exact username that I use. Has anyone experienced this before? I assume I should just make a new admin account and delete the old one? That shouldn't have any impact on posts or anything, right?
 
Is there an issue with using the same image in multiple articles?
No, it's not a problem. The same images get used time and again across the entire internet; that's what stock photos are for. It hasn't been an issue yet. The one caveat is the Product Reviews updates, which look for original images in product reviews.

I installed a History/Activity plugin and saw that a variety of different IP addresses keep trying to log into my WordPress site. I get that bots will do that, but they are attempting to log in with the exact username that I use. Has anyone experienced this before? I assume I should just make a new admin account and delete the old one? That shouldn't have any impact on posts or anything, right?
They can get your admin account name in several ways. It'll be in your WordPress author URL, and every author login name is exposed via the REST API (at /wp-json/wp/v2/users). The best thing you can do is use strong, unique passwords.
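For context, that core endpoint returns every author's login slug as JSON. A minimal sketch of what the response looks like and how the slugs fall out of it (the sample payload and names below are made up for illustration):

```python
import json

def author_slugs(payload: str) -> list[str]:
    """Pull the login slugs out of a /wp-json/wp/v2/users response body."""
    return [user["slug"] for user in json.loads(payload)]

# Shape of a typical (trimmed) response from the core users endpoint.
sample = '[{"id": 1, "name": "Jane Doe", "slug": "admin-jane"}]'
print(author_slugs(sample))  # prints: ['admin-jane']
```

If that endpoint returns your login slug on your own site, that's most likely how the bots are getting your exact username.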

You can also add a 2-factor login so that they can't even reach the actual login page without first getting past the initial authentication. In the Security Day of the Crash Course I cover .htpasswd and show how I set it up. This also stops the inadvertent DDoS-ing that can come from brute-force hack attempts on unprotected login pages.
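For anyone who hasn't seen that lesson, the general shape of the .htpasswd approach on Apache looks like this (a sketch only; the file paths and credentials file are placeholders, and details vary by host):

```apacheconf
# 1) Create the credentials file once, outside the web root:
#    htpasswd -c /home/youruser/.htpasswd yourname
# 2) Then gate wp-login.php in your site's .htaccess:
<Files "wp-login.php">
    AuthType Basic
    AuthName "Restricted"
    AuthUserFile /home/youruser/.htpasswd
    Require valid-user
</Files>
```

Bots hammering wp-login.php then get challenged by the server before WordPress (and your database) ever has to do any work.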
 
I've made all external links as nofollow by default using Rankmath. I now want to make one external link dofollow. The problem is I have other links to the same website which I want to keep as nofollow (they're affiliate links).

Is there a way to do this in Rankmath? I know they have an option to whitelist domains, but that'll make all the other links to that website dofollow as well.
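I'm not sure Rankmath supports a per-link override, but one manual workaround is to switch the block to HTML view and set the rel attribute on each link yourself (the URLs below are placeholders). Depending on how the plugin applies its filter on render, a manually edited rel can still get overwritten, so check the live page source afterwards:

```html
<!-- the one link you want followed: no nofollow in rel -->
<a href="https://example.com/great-guide/">their guide</a>

<!-- affiliate links to the same domain keep nofollow -->
<a href="https://example.com/product/?aff=123" rel="nofollow sponsored">the product</a>
```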
 
After reading this thread about A.I. generated content getting slapped (it was a matter of time, really) I came across something others said and was not aware of.

And I quote:

I also have instant indexing which Google said not to use unless you have a news site

Is indexing a new article through Google's Search Console right after posting frowned upon? Or does this have to do with Rank Math and Google's API whatever-whatever (something I have just discovered and am not interested in)?

I may be wrong, so don't quote me on it, but I think I remember Mr. Media saying in his AMA thread that he would index articles through GSC after posting.

Would you advise against that?
 
Does anyone know any keyword research tools that integrate into google search console that make finding keywords easier? I already know of keywords everywhere and query hunter. They're ok but is there anything better?

I'm wondering if there are tools that can dig out good keywords that I haven't written an article on yet.
 
Quick question:
Do you guys ever do articles that are competitive, but your main goal is to be an authority in a specific cluster or niche? Or is this a waste of time/money?
 
Quick question:
Do you guys ever do articles that are competitive, but your main goal is to be an authority in a specific cluster or niche? Or is this a waste of time/money?
You'll have to target competitive terms eventually regardless. With this said, the goal should be to rank and get traffic. You're better off targeting a term that receives an estimated 1,000 searches each month and ranking #1 than targeting a term that receives an estimated 10,000 searches each month and ranking #12. Start with the low competition keywords first to get the ball rolling, then move on once they've been exhausted.
 
Quick question:
Do you guys ever do articles that are competitive, but your main goal is to be an authority in a specific cluster or niche? Or is this a waste of time/money?
Absolutely.

I think this is a major point that a lot of people miss with these "guides" to ranking.

Everyone tells you to pound KGR keywords until you're fucking blue in the face; all the while they're not interlinking properly, not sending links to every part of their silos, and not doing anything on-site that establishes them as an authority in a niche - schema, topical authority, authorship, etc.

I mean I'm personally guilty of being awful at internal links and I've heavily focused on it this year and it's helped massively - not to mention using ZORA from @CCarter and @eliquid. Which I also plan to use wayyy more in 2023.

From the horse's mouth in regards to internals: https://www.searchenginejournal.com/internal-linking-critical-for-seo/441381/

TL;DR: It's important and you should be strategic when doing it.

Here's John also on the "amount" of content on a site in terms of being authoritative: https://www.reddit.com/r/SEO/comments/qq6vkb/less_than_50_of_new_articles_are_indexing/

"It's really hard to call a site authoritative after 30 articles, and especially if you've stopped publishing for a while."

If anything you can think of those "bigger keywords" as content gaps in your "topical map." If you mention something in a lot of your articles, there's a good chance you have the authority to write about a bigger keyword.

The way I always look at "bigger" keywords too is that the goal isn't to rank right now - which a lot of people struggle with; You're writing an article that may net you nothing in 2 or 3+ years.

Just to illustrate:

On a personal blog I author, I mentioned a particular "thing" in 77 of my 158 posts. I knew early on I'd mention that particular thing, and it's related to another thing, so I added it to the other thing's top-down silo - all while stacking internals.

Let's look at what happens.

Here's the bigger keyword:
[attached screenshot: thing.png - ranking chart for the bigger keyword]

Would you look at that, Batman.

Peaks and valleys are pretty normal with bigger keywords as the algorithm figures out whether you should be ranking - there are patents on this that I believe Bill Slawski or Koray GÜBÜR have gone over ad nauseam at this point.

https://www.seobythesea.com/
https://www.holisticseo.digital/

You can feel free to find them yourself, you'd probably learn a bunch along the way.
 
How do you handle duplicate content on your own website?

For example, let's say I write posts about sport. In posts about running and posts about cycling I want to mention the importance of a good warm-up.

So I have a paragraph:
- Why is warming up so important.

Would it hurt to use the paragraph about 'why warming up is so important' in multiple posts?

Basically, you could do two things:
- Write a separate post about warming up in general and link to it from each post.
- Include the paragraph in each post (i.e. why warming up is important for running and why warming up is important for cycling). You can modify the paragraph a little to fit the topic better.

The second way will make the article stronger in my opinion, but you use the same paragraph in two separate articles. What are your thoughts on this?
 
I have a dumb question.

Assume that Site A has a post at https://www.siteA.com/post-slug-one/ and this page contains an image (think infographics).

If other sites link to that image (not the post), will those backlinks influence the ranking of that page?
 
How do you handle duplicate content on your own website?

For example, let's say I write posts about sport. In posts about running and posts about cycling I want to mention the importance of a good warm-up.

So I have a paragraph:
- Why is warming up so important.

Would it hurt to use the paragraph about 'why warming up is so important' in multiple posts?

Basically, you could do two things:
- Write a separate post about warming up in general and link to it from each post.
- Include the paragraph in each post (i.e. why warming up is important for running and why warming up is important for cycling). You can modify the paragraph a little to fit the topic better.

The second way will make the article stronger in my opinion, but you use the same paragraph in two separate articles. What are your thoughts on this?

I write about topics where there are things I have to write about over and over again and never had an issue.

However, I never copy and paste from one article to the other. I write it again and again. Because, naturally, I will use different phrasing each time so it will be unique.

In your example, let's imagine you have three articles: one about running (where you have a section about warming up), one about cycling (where you have a section about warming up), and another article about the importance of warming up in sports.

In your article about running, in the section about warming up you could say something like:

"Most beginner runners make the mistake of not warming up. Even though running is a pretty straightforward activity, you still want to make sure you do it safely. Warming up properly will decrease the risk of injury and also increase your performance.

[a paragraph or list about warm-up drills for running]

If you want to read more about the importance of warming up, check out this article here."

And in your article about cycling, in the section about warming up, you may write something like:

"Before you start cycling, we recommend doing a quick warm-up. Warming up before cycling will not only decrease the likelihood of injury but will also prepare you mentally. A quick warm-up will increase your heart rate and blood flow, allowing your body to produce energy more efficiently.
[a paragraph or list about warm-up drills for cycling]
You can read more about this topic in our article about the importance of warming up in sports."

Same topic, different phrasing, both linking to the article that talks about that particular topic at length.
 
How much value does it have for a site's ability to rank to let it 'sit' for a while? I've got a site that has sat in Google's index for over 3 years, has about 5 articles, and has never had a link built to it. But just having a look on UberSuggest, it's in the top 100 for some reasonably competitive terms. Guessing if I start adding content and build a few links, I should see traction a lot quicker than with a new domain?
 
How much value does it have for a site's ability to rank to let it 'sit' for a while? I've got a site that has sat in Google's index for over 3 years, has about 5 articles, and has never had a link built to it. But just having a look on UberSuggest, it's in the top 100 for some reasonably competitive terms. Guessing if I start adding content and build a few links, I should see traction a lot quicker than with a new domain?
The domain has aged, which helps. But each individual post must also age. So, is this domain better than a fresh domain? Yes, technically. Will you still have a sandbox to deal with? Yes.
 
Quick random questions:

1. I noticed this site which is doing well in the SERPs in a particular niche. It seems on each of their posts they link back to their homepage/domain, with their brand as the anchor. Do you think this is a smart idea?

2. They have a section that says "Internet Appearances" with a whole bunch of directory listings. But it seems like they aren't a local business. Do you think this is a good strategy to get more backlinks/rank higher? And why did this guy create a page like this? Does it help some sort of SEO or what?

3. Related to #1: on my old paste site, near the end of a post, if it was a product review I'd write "product review" and link to that same exact page with "product review" as the anchor. Do you think this is bad or good practice?
 
It is quite hard to add my keywords to the text when I write, so in most cases I just add them to the title, intro, H2s, and conclusion. Is it bad to use a keyword as little as 3-4 times in a text that is 2-4k words long?
 
Quick random questions:

1. I noticed this site which is doing well in the SERPs in a particular niche. It seems on each of their posts they link back to their homepage/domain, with their brand as the anchor. Do you think this is a smart idea?

2. They have a section that says "Internet Appearances" with a whole bunch of directory listings. But it seems like they aren't a local business. Do you think this is a good strategy to get more backlinks/rank higher? And why did this guy create a page like this? Does it help some sort of SEO or what?

3. Related to #1: on my old paste site, near the end of a post, if it was a product review I'd write "product review" and link to that same exact page with "product review" as the anchor. Do you think this is bad or good practice?
This is all bullshit. Don't ask any other questions until you've written and published 150 articles on your new website and let them age for 3 months.
 
Does embedding a page of your website on multiple Web 2.0s have any effect? I read that in some scenarios Googlebot indexes the content inside an iframe.
 
I get a 403 error for an external link in Screaming Frog. However, if I visit the URL (directly or via Google) the page is accessible. Am I missing something here? Is this still a problem?
 
I get a 403 error for an external link in Screaming Frog. However, if I visit the URL (directly or via Google) the page is accessible. Am I missing something here? Is this still a problem?
A 403 error is a “forbidden” code, meaning they’re probably just blocking common bot crawlers and scrapers. That’s why you can still get through.
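If you want to confirm it's user-agent based, you can build the same request with different User-Agent headers yourself. A small sketch using only the standard library (the URL below is a placeholder):

```python
import urllib.request

def request_as(url: str, user_agent: str) -> urllib.request.Request:
    """Build a request with a custom User-Agent -- the header many
    servers key on when returning 403 to crawlers like Screaming Frog."""
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

# A browser-style UA usually gets through; a crawler UA may get a 403.
req = request_as("https://example.com/", "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
print(req.get_header("User-agent"))  # prints: Mozilla/5.0 (Windows NT 10.0; Win64; x64)

# To actually send it (needs network access):
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
```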
 
A 403 error is a “forbidden” code, meaning they’re probably just blocking common bot crawlers and scrapers. That’s why you can still get through.
Is it a problem to link out to such URLs?
You mean only crawlers like SF notice this and it's not a problem for big G?
 
Is it a problem to link out to such URLs?
You mean only crawlers like SF notice this and it's not a problem for big G?
You can look up ways to crawl it as Googlebot. There are websites that can fetch the page for you using that user agent, or you can set it up in your browser yourself. I'm certain that if they're blocking bots, they know enough not to be blocking Google. Search the URL to see if it's indexed, and check the Google cache to see what Google sees!
 
You can look up ways to crawl it as Googlebot. There are websites that can fetch the page for you using that useragent or you can set it up in your browser yourself. I'm certain if they're blocking bots they know enough to not be blocking Google. Search the URL and see if it's indexed and check the Google cache and see what Google sees!
Seems like they are blocking Screaming Frog. Just refreshed the page as Googlebot and seems to load fine. Thanks!
 
Another question. I ran the Lighthouse speed test. Speed is perfect, but I got the following:
"No CSP found in enforcement mode".

After looking on Google I am still not sure: is this a problem? If yes, how do I solve it? I use a standard theme on WordPress, so it's not a self-built website or anything.
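For what it's worth, that Lighthouse item is a best-practices warning, not an error: it just means no Content-Security-Policy header is being enforced on your pages. If your host runs Apache, one way to add a header is via .htaccess (a sketch only; the policy below is an illustration, and a strict policy can break theme and plugin scripts, so start in report-only mode and check your browser console before enforcing):

```apacheconf
# Requires mod_headers. Report-only first, so a bad policy can't break the site.
Header set Content-Security-Policy-Report-Only "default-src 'self'; img-src 'self' data:; script-src 'self' 'unsafe-inline'"
```

Once the console shows no violations on your real pages, you can switch the header name to Content-Security-Policy to enforce it.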
 