The "No Dev Question is Stupid" Thread - Basic HTML / CSS / Etc.

JasonSc

BuSo Pro
Joined
Mar 30, 2016
Messages
97
Likes
137
Degree
0
Are there any negative issues with having a large .htaccess file?

I'm working on a site where the slugs are basically KW stuffed. When all is said and done, the .htaccess file would have over 500 individual 301 redirects plus the normal caching code.

I'm not worried about the internal linking, because I can fix that, but this site has a lot of backlinks, and I don't want to waste any link juice. I also want to provide a good user experience.

Relatively speaking, it's a low-volume site: about 100 visits a day.
 

Rageix

BuSo Pro
Joined
Jan 11, 2017
Messages
94
Likes
103
Degree
0
Are there any negative issues with having a large .htaccess file?

...

Relatively speaking, it's a low-volume site: about 100 visits a day.
On one hand, yes, it's a negative, since Apache has to read and parse that .htaccess file on every request. On the other hand, no, the site only gets 100 visits a day.

Probably fine for now, but if you grow the traffic, you should really clean that up.
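
If you ever get root access to the box, one cleaner option is moving the redirects out of .htaccess and into the main server config, which Apache parses once at startup instead of on every request. A minimal sketch (the slugs here are placeholders):

Code:
# Inside the site's vhost config (e.g. /etc/apache2/sites-available/example.conf)
# Parsed once when Apache starts, not on every request like .htaccess
Redirect 301 /old-kw-stuffed-slug/ /clean-slug/
Redirect 301 /another-old-slug/ /another-clean-slug/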
 
Joined
Dec 17, 2015
Messages
174
Likes
98
Degree
0
New question guys (and girls)...

I have experience in a certain niche selling customised products. My other sites use OpenCart, but I find it a bit of a nightmare, so I wanted to use WordPress this time for a new site. Basically, I need customers to be able to buy a product, upload their own image to go on the product, and then go through to checkout. Is this easy to do? I've looked at a few plugins, but I wondered if any of you have experience with this.

Cheers
 
Joined
Apr 5, 2017
Messages
131
Likes
86
Degree
0
What commands should I use to delete everything inside the "public" directory (/srv/users/serverpilot/apps/pizza-site/public) except .htaccess and robots.txt?

I will be using this as a scheduled cron task.
 

SmokeTree

Developer/Linux Consultant
BuSo Pro
Digital Strategist
Joined
Sep 23, 2014
Messages
225
Likes
409
Degree
1
What commands should I use
Since this is just 2 files, you're probably best off creating a tmp dir, moving the 2 files to it, deleting everything in the public dir, then copying the files back. Here's a very simple bash script that should do the trick (back up your files first).

Code:
#!/bin/bash

# Directory to clean out, and a temporary holding area for the two keepers.
SITE_DIR='/srv/users/serverpilot/apps/pizza-site/public'
TMP_DIR='/tmp/pizza_site_bak'

# Start with a fresh temp dir.
rm -rf "$TMP_DIR"
mkdir "$TMP_DIR"

# Stash the two files we want to keep.
cp "$SITE_DIR/.htaccess" "$TMP_DIR"
cp "$SITE_DIR/robots.txt" "$TMP_DIR"

# Wipe the public dir. Note that * doesn't match dotfiles,
# so .htaccess would survive this line anyway.
rm -rf "$SITE_DIR"/*

# Put the two files back.
cp "$TMP_DIR/.htaccess" "$SITE_DIR"
cp "$TMP_DIR/robots.txt" "$SITE_DIR"
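
To run it from cron as planned, save it somewhere like /srv/users/serverpilot/cleanup_pizza_site.sh (a hypothetical path), make it executable, and add a crontab entry, e.g. nightly at 3 a.m.:

Code:
# minute hour day-of-month month day-of-week command
0 3 * * * /srv/users/serverpilot/cleanup_pizza_site.sh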
 

built

Gotta get it before its too late
BuSo Pro
Boot Camp
Joined
Jan 23, 2015
Messages
1,599
Likes
1,374
Degree
4
Question:

If I use @font-face for a custom font, do I have to upload a bold version and an italic version too?
 

Ryuzaki

女性以上のお金
Staff member
BuSo Pro
Digital Strategist
Joined
Sep 3, 2014
Messages
3,911
Likes
7,439
Degree
8
Question:

If I use @font-face for a custom font, do I have to upload a bold version and an italic version too?
Yes and no. It depends on whether you want to trust the browser to synthesize the bold and italic styles (faux bold and faux italic look different in each browser) or whether you want to provide the real files for design consistency.
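
If you do provide them, you declare each file under the same family name with the matching weight and style. A minimal sketch, assuming you have the extra .woff2 files (the file names are placeholders):

Code:
/* Regular */
@font-face {
  font-family: 'MyFont';
  src: url('/fonts/myfont-regular.woff2') format('woff2');
  font-weight: normal;
  font-style: normal;
}

/* Real bold, used instead of a browser-synthesized faux bold */
@font-face {
  font-family: 'MyFont';
  src: url('/fonts/myfont-bold.woff2') format('woff2');
  font-weight: bold;
  font-style: normal;
}

/* Real italic */
@font-face {
  font-family: 'MyFont';
  src: url('/fonts/myfont-italic.woff2') format('woff2');
  font-weight: normal;
  font-style: italic;
}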
 

Joe

Joined
Apr 25, 2015
Messages
156
Likes
96
Degree
0
What commands should I use to delete everything inside the "public" directory (/srv/users/serverpilot/apps/pizza-site/public) except .htaccess and robots.txt?

I will be using this as a scheduled cron task.
Code:
#!/usr/bin/env bash
# Enable extended globbing so the !(...) negation pattern works.
shopt -s extglob
cd /srv/users/serverpilot/apps/pizza-site/public || exit 1
# Delete everything except the two keepers. Plain globs skip dotfiles
# anyway, so .htaccess would survive even without being listed here.
rm -rf -- !(.htaccess|robots.txt)
 

darkzerothree

DunkelNullDrei
Joined
Feb 16, 2017
Messages
485
Likes
331
Degree
2
You can also use find for this

Code:
find . -maxdepth 1 -type f ! -name ".htaccess" ! -name "robots.txt" -delete
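
Note that -type f only removes plain files, so any subdirectories survive. If those should be wiped too, a variant like this (untested sketch) handles both:

Code:
find . -mindepth 1 -maxdepth 1 ! -name ".htaccess" ! -name "robots.txt" -exec rm -rf {} +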
 
Joined
Apr 5, 2017
Messages
131
Likes
86
Degree
0
My CDN uses the subdomain "www", which uses the primary domain as its origin.

Previously, I was using the primary domain as the main site, but I'm finding little reason to have my visitors connect to my server at all when I can direct them to "www" instead, which saves me resources.

Am I missing out on any caveats (search, technical) that I should take note of? I have pointed everything to the "www" site properly, and if a visitor lands on the primary domain, all the links on it go to the "www" version.

Essentially, it's a static, CDN-hosted copy of the primary static site.

Additionally, since the CDN is a mirror of the primary site, is there a way I could still pull from the primary site (or redirect to it) automatically if the www, CDN-hosted site is down?
 
Joined
Apr 5, 2017
Messages
131
Likes
86
Degree
0
Update: for some odd reason, after changing all the pointers to "www", clicking links on the "www" site leads back to the primary domain site, even after checking that the hrefs have "www" in them.

This is incredibly strange and I can't figure out why. Reverting to the original setup for the moment.
 
Joined
Apr 5, 2017
Messages
131
Likes
86
Degree
0
@built Not if you properly declare your canonical links. That, and if all your directory links point to one version of the site, it helps prioritize that one for search. A CDN essentially duplicates your entire site and supports it by serving the assets (images, JavaScript, CSS) that it duplicated.

That means that while the first request and the HTML itself are served by your server, the asset URLs in the HTML are replaced with CDN URLs, which takes the load off your server.
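
For reference, the canonical declaration is just a tag in the head of every page, pointing at whichever version you want search engines to treat as primary, e.g.:

Code:
<link rel="canonical" href="https://www.example.com/some-page/">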

I wanted to go one step further and explore using the CDN as the main front end (no touching of my server except by the CDN), but strange things happened on KeyCDN.

I'm not sure if it's typical CDN behavior that I don't know about or a problem I should contact KeyCDN support about. The CDN is pulling from a completely static site.

Maybe a regular CDN wouldn't work for this; I'm uncertain. I saw that surge.sh lets you push your static site to them directly via the command line, and they'll host it for you for $0 and the cost of independence.
 

Cash Builder

BuSo Pro
Joined
Jan 14, 2017
Messages
337
Likes
369
Degree
2
I have a list of sites, and I want to find out what articles on these sites contain my keyword. I would like a list of all the articles.

What I am doing just now is searching each site individually with a site:mydomain.com 'keyword' search on Google. As you can imagine, this is incredibly time-consuming.

There is probably an easy solution. Is this what Scrapebox is used for? Or could I code something in Python (once I learn Python)?
 

built

Gotta get it before its too late
BuSo Pro
Boot Camp
Joined
Jan 23, 2015
Messages
1,599
Likes
1,374
Degree
4
I have a list of sites, and I want to find out what articles on these sites contain my keyword. I would like a list of all the articles.

What I am doing just now is searching each site individually with a site:mydomain.com 'keyword' search on Google. As you can imagine, this is incredibly time-consuming.

There is probably an easy solution. Is this what Scrapebox is used for? Or could I code something in Python (once I learn Python)?
Yes, you can use Scrapebox for that.
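
If you do end up learning Python, a rough sketch of the DIY route is below. It assumes each site exposes a flat /sitemap.xml (not a sitemap index) and that a plain keyword-in-HTML check is good enough; requests is a third-party library you'd install first:

Code:
# Fetch each site's sitemap, download every listed URL, and print the
# pages whose HTML contains the keyword.
import requests
from xml.etree import ElementTree

KEYWORD = "my keyword"
SITES = ["https://example.com"]  # your list of sites

# Namespace used by the standard sitemap protocol
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

for site in SITES:
    sitemap = requests.get(site + "/sitemap.xml", timeout=10)
    root = ElementTree.fromstring(sitemap.content)
    for loc in root.findall(".//sm:loc", NS):
        page = requests.get(loc.text, timeout=10)
        if KEYWORD.lower() in page.text.lower():
            print(loc.text)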
 

bernard

BuSo Pro
Joined
Dec 31, 2016
Messages
756
Likes
574
Degree
2
Can you recommend some books on web DESIGN? Not the technical aspects, but design principles, color schemes, breaking up space with various content types, etc.
 

Robin

I ain't Robin
BuSo Pro
Joined
May 3, 2016
Messages
30
Likes
35
Degree
0
I have a simple external JSON file which contains some basic information that I want to output on my website. It's an array of objects, each with several properties, and I'm trying to figure out how to load it and loop until all the entries have been written out, e.g. into a table.
In my head it seems relatively simple; however, I can't figure out how to get it started.

Any hints to get me started are very much appreciated. :-)

Code example:
Code:
[{"id":"123","time":"2017-12-18 12:12:12","message":"{user} text here","params":{"user":"Robin"},"boat":{"id":"551","name":"Wave Racer 1000"}},{"id":"456","time":"2017-12-18 12:12:12","message":"{user} text here","params":{"user":"John"},"boat":{"id":"552","name":"Wave Racer 2000"}}]
 
Joined
Oct 18, 2017
Messages
21
Likes
6
Degree
0
I created a WordPress instance using Bitnami on Amazon LightSail, but now I have no idea how to set up a domain to point at the WordPress install. All I see is a default WordPress install when I visit the IP address that the Bitnami stack created.

How do I set up my domain in LightSail/Bitnami so I can host my blog there?
 

Ryuzaki

女性以上のお金
Staff member
BuSo Pro
Digital Strategist
Joined
Sep 3, 2014
Messages
3,911
Likes
7,439
Degree
8
I just Googled this while waiting on my ride to show up, so I've not done this myself, but what I found sounds like a two-step process.

You won't just punch in domain name servers at the registrar. You'll have to set up "A records" that point the domain to the IP address. Then, to get the site running on the domain instead of the IP address, you'll have to edit wp-config.php via an FTP client, changing these values from the IP to the domain name:
Code:
define('WP_HOME','http://example.com');
define('WP_SITEURL','http://example.com');
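
The A records themselves would look roughly like this in a DNS zone, with 203.0.113.10 standing in for your LightSail static IP:

Code:
example.com.      IN  A  203.0.113.10
www.example.com.  IN  A  203.0.113.10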
 
Joined
Apr 7, 2016
Messages
291
Likes
190
Degree
1
I changed the slug of a page in WordPress, and now it redirects the old URL to the new one. I don't want this to happen, because I want a custom post type archive to take the old URL's place.

Ex) The old slug "/reviews/" was changed to "/reviews-test/".
I have a new custom post type archive that I want to take its place at "/reviews/".

I was using the Redirection plugin
Code:
https://redirection.me/
and it wasn't set to monitor page URL changes. Even if it had been, the redirect should have shown up in the plugin's list, and there was nothing there. I checked my .htaccess file and there was nothing there either.

I cleared everything out of the plugin and deleted it. I also turned off all other plugins, including Thrive Architect and Yoast SEO.

I'm still having issues with this redirect. It's probably something obvious that I'm overlooking.
 

turbin3

BuSo Pro
Joined
Oct 9, 2014
Messages
615
Likes
1,274
Degree
3
When changing slugs in WordPress, a meta_key of "_wp_old_slug" is added to the wp_postmeta table in your database. This is a built-in function of WordPress.

The function "wp_old_slug_redirect" references "_wp_old_slug" and uses it to redirect to the new slug.

I'd check your wp_postmeta table and see if you can find a meta_key of "_wp_old_slug" with a meta_value of your old slug. If you find it, just delete that row and the redirect should be gone.
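
A sketch of the queries, assuming the default wp_ table prefix and that your old slug was "reviews":

Code:
-- Find any leftover old-slug rows first
SELECT post_id, meta_value FROM wp_postmeta WHERE meta_key = '_wp_old_slug';

-- Then delete the one holding the old slug
DELETE FROM wp_postmeta WHERE meta_key = '_wp_old_slug' AND meta_value = 'reviews';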

If you need help checking the table, let us know. Most hosts, especially shared hosts and hosts with cPanel, usually have something like phpMyAdmin available for this.
 
Joined
Apr 7, 2016
Messages
291
Likes
190
Degree
1
When changing slugs in WordPress, a meta_key of "_wp_old_slug" is added to the wp_postmeta table in your database. This is a built-in function of WordPress.

The function "wp_old_slug_redirect" references "_wp_old_slug" and uses it to redirect to the new slug.

I'd check your wp_postmeta table and see if you can find a meta_key of "_wp_old_slug" with a meta_value of your old slug. If you find it, just delete that row and the redirect should be gone.

If you need help checking the table, let us know. Most hosts, especially shared hosts and hosts with cPanel, usually have something like phpMyAdmin available for this.
Thank you very much! I didn't know WordPress had that built in. I just recently learned SQL, so this worked out perfectly. My host has something similar to phpMyAdmin that I was able to run the query through.
 
Joined
Jun 24, 2015
Messages
27
Likes
16
Degree
0
I got a low Pingdom score on posts with a lot of comments.

It flags "Remove query strings from static resources", and it seems that Gravatars are the problem. I managed to solve the other page speed problems with plugins, but not this one. Is there any code that can help me?
 

Cash Builder

BuSo Pro
Joined
Jan 14, 2017
Messages
337
Likes
369
Degree
2
I am trying to run an Ahrefs site audit on my WordPress site, but the crawl fails every time. Ahrefs support says it's because their bot encounters a captcha on my site, which prevents them from crawling.

My web host says that they are not blocking any bots, and according to their logs the Ahrefs bot is landing on my site fine.

I also noticed that when I put my site into Ahrefs Site Explorer, instead of showing my site's title it just says "Captcha".

I have tried deactivating plugins, but it's still not working.

Any ideas on what could be causing this?