Categories
geek programming software tips wordpress

highlight.js on WordPress without plugins

When you frequently post code in your blog posts, it’s nice to have that code syntax highlighted for readability. On WordPress, there was this plugin to do just that, using the JavaScript library highlight.js. However, that plugin is no longer maintained, and has a security issue. So it’s time to get rid of that.

Instead of looking for yet another highlighter plugin, I thought this would be easy to integrate without a new plugin. It’s just about including a JavaScript library on your site, really.

Here’s how I got it working.

1. The easiest no-fuss, non-optimized way.

Basically, it's enough to include the highlight.js script and call its initialization code.
So if you drop the code below into a WordPress Custom HTML widget, you're set. It's best to put this code in the footer block of your site, to keep it from slowing down your page load.

<link rel="stylesheet"
      href="//cdnjs.cloudflare.com/ajax/libs/highlight.js/11.7.0/styles/default.min.css">
<script src="//cdnjs.cloudflare.com/ajax/libs/highlight.js/11.7.0/highlight.min.js"></script>
<script>hljs.highlightAll();</script>

This will get the code from CDNJS, and run the initialization.

Simple, but not optimized. It loads the script & CSS file from an external domain, which is slower than loading them from your own (HTTPS handshakes etc.). Privacy-wise, you might not want to inform Cloudflare (CDNJS is theirs) about who visits your site either. The same goes for any other external CDN you target.

You might also be missing some languages that are not in the default set. You can include the script for, say, Go or Rust by adding another script tag, but the more of those script tags you add, the slower your page is going to become, as more resources need to be loaded.
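For example, adding Go support means pulling in one more script tag from the CDN. The path below follows CDNJS's layout for highlight.js language modules; double-check it against the version you're actually loading:

```html
<script src="//cdnjs.cloudflare.com/ajax/libs/highlight.js/11.7.0/languages/go.min.js"></script>
```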

So let’s try the optimized approach, with a bit more work.

2. The more-fuss, optimized way

First, head over to https://highlightjs.org/download/ and create a custom pack for your site by including all the languages you think you’ll need. By hosting those files on your own site, you’re taking away the external domain handshaking, making it load faster, and you avoid tracking. You’ll also have a single JS & CSS file to load, instead of multiple for the different languages you want, which is another speed increase.

If you have FTP access to your site, you can upload these anywhere you like. If you only have access to your WordPress instance, you can upload the files in your Media Library, and get the direct links from there. Use the Document format for that.
Once you have the links, you can again add them to your site’s footer with the Custom HTML widget. Set the path for the files to wherever they are hosted on your site.
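The widget code then looks just like the CDN version, but with your own URLs. The upload paths below are made-up examples; substitute whatever links your host or Media Library gives you:

```html
<link rel="stylesheet" href="/wp-content/uploads/highlight/default.min.css">
<script src="/wp-content/uploads/highlight/highlight.min.js"></script>
<script>hljs.highlightAll();</script>
```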

Another way to add the includes in your site’s footer, is by using the Head & Footer code plugin. If you don’t have this installed, it’s probably not worth the fuss, but I’m already using this for other stuff, so I might as well use it here.
This puts the code lower in the HTML page than you can get it with an HTML widget.

3. Spend hours finding the right color scheme for your site

Yeah. You know how it goes. Once you start trying out all the color themes, you’re off for hours. :)

Categories
geek internet mystuff n3wjack opensource programming software tips tools

an open source web crawler

As a web developer, I often find it handy to crawl one of my sites and see if any links are broken, or are returning plain 500 errors because something is broken.

A classic tool for this is Xenu's Link Sleuth. That tool is old though: it's no longer updated, and it's a pure GUI tool. Since I couldn't find what I wanted as a ready-to-use command line tool, I sat down and wrote my own. It took a while, but recently it became functional enough to release as a v1 and open up to the world as an open source tool.

So with that I present *drumroll*: Sitecrawler, a command line based site crawler (yeah, I know, naming is hard).

What can it do?

  • Crawl a site by following all links in a page. It only crawls internal links and HTML content.
  • Crawls links only once. No crawling loops please.
  • Export the crawled links to a CSV file, including the referrer links (handy for tracking down 404s).
  • Limit crawl time to a fixed number of minutes for large sites.
  • Set the number of parallel jobs to use for crawling.
  • Add a delay to throttle requests on slow sites, or sites with rate limits.

It’s written in .NET 6, so it runs on Windows, Mac and Linux. Check it out on GitHub for more details and downloads. It’s proven useful for me already, so I hope it does the same for you.

Categories
gadget gaming geek programming software tips

uninstall the HP Omen Gaming HUB

You don’t really need this piece of bloatware that comes with your HP Omen machine. It’s fast enough on its own to be able to run the latest games, and I doubt the HUB software makes an actual difference.

In fact, it might even hinder performance. I noticed something odd with it, and it turned out that every minute it was launching a PowerShell process to check if some other software was installed. Every. Minute.

This PowerShell session isn’t started with the -noprofile switch either, so that means your PS profile is loaded every time. This slows things down, certainly if you have a profile with lots of handy PS modules in it, like I have.
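You can see what profile loading costs on your own machine by timing a PowerShell startup with and without the -NoProfile switch (both Measure-Command and -NoProfile are standard PowerShell features):

```powershell
# Startup that loads your profile...
Measure-Command { powershell.exe -Command "exit" }
# ...versus one that skips it with -NoProfile.
Measure-Command { powershell.exe -NoProfile -Command "exit" }
```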

So I ended up uninstalling the HP Omen Gaming HUB and never looked back. That also fixed the annoying bug where the HUB disabled my Windows key, and didn’t switch it back on after quitting the game…

So, how do you uninstall the HUB?
Pretty simple: use the built-in Windows settings tool to uninstall software, and search for "HP" to find it.
If you want it back again afterward, you can find it on the HP site to reinstall it.

If you do want some extra performance out of your machine while gaming, here’s a tip: close all your browsers and Electron apps. That’ll free up a ton of memory.

Use the Task Manager to see what is gobbling up memory and CPU on your PC, and close the apps. That’ll free up more resources than the Gaming HUB will ever be able to do.

Categories
geek microsoft programming software tips tools

home made progress bars and indicators in PowerShell

Sometimes you write this fancy batch script that does a bunch of stuff, and you want to have it print out some status information as it’s doing its thing.

Sometimes this fancy script is doing a lot and there’s going to be a lot of stuff printed, so it would be nice if you could overwrite the previous bit of text. Basically, you want a progress bar or progress indicator of some sort.

There are a few ways of doing this, and one involves manipulating the $Host.UI.RawUI.CursorPosition values. That needs “a lot” of code for something that you really don’t want to write a lot of code for.
There are also the oldskool typewriter control characters, however. Like the carriage return, `r in PowerShell, which does pretty much the same thing.

So this bit of code, for example, prints everything out on a single line, even though it’s doing that a hundred times:

100..0 | % { write-host "`r- Items to process: $($_)".PadRight(25) -nonewline; sleep -milliseconds 20 }

The magic is in this line:

write-host "`r- Items to process: $($_)".PadRight(25)

Note the “`r” at the beginning of the line. This will reset the cursor to the beginning of the current line, printing the text behind it over any text already present on that line.
Do this in a loop, and you keep writing over the previously printed text.

This also explains the PadRight() call, which adds spaces to the end of the line to erase any characters left over when the previously printed line was longer than the current one.
That happens a number of times in this case, as we're counting down from 100 to 0. I know there are smarter ways to handle this, but it works just fine right here (KISS).

Here’s another example using the CR trick. An actual character based progress bar. Just copy-paste and run it in a shell to see the effect:

1..20 | % { write-host ("#"*$_ + "|" * (20-$_) + "`r") -nonewline ; sleep -Milliseconds 200 }; ""

The following example is a bit more complex. It displays a spinner for longer running operations using a set of characters.

# Animation object to keep state.
$global:animation = @{ Chars = "-\|/"; Ix = 0; Counter = 0 }

# Animate one step every 500 calls. Lower the number for a faster animation.
function Animate() {
    $a = $global:animation

    if (($a.Counter % 500) -eq 0) {
        Write-Host " $($a.Chars[$a.Ix])`r" -NoNewline
        $a.Ix = ($a.Ix + 1) % $a.Chars.Length
    }
    $a.Counter++ 
}

# Usage example. Call the animation inside a loop.
# Run Animate first, so the boolean condition is the script block's output.
$largeImages = ls *.jpg -r | where { animate; $_.Length -gt 100000 }

There’s also the official Write-Progress PowerShell cmdlet to show a progress bar on the screen. You might want to check that out too. I’m not a fan of it myself, because it tends to act strange when you scroll in your shell window, but for more complex status updates it can be really handy.

I hope this helps to make your scripts a bit more informative (or fun) when running long jobs.

Categories
geek microsoft programming software tips

asp.net cache profile location attribute

Here’s something that confused me recently. I was debugging some caching issue, and it looked like the ASP.NET (framework) site wasn’t actually using the output cache settings.

The problem was that I was looking at the response headers, which kept showing the Cache-Control: no-cache value. When I debugged the controller however, I noticed it didn’t always hit the breakpoint, so the cache was working after all.

Turns out the location of the cache profile was set to server.

<add name="somepage" duration="60" location="Server" />

Setting the location attribute to Server makes ASP.NET cache the output in memory on your server, while setting the response header to no-cache. That way, proxies like Cloudflare won’t cache the response, but you can still do fancy stuff in your ASP.NET app, using internal data to vary the responses cached on the server itself (see VaryByCustom).
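For context, that add line lives in the output cache profiles section of web.config, and a controller action opts in by referencing the profile name. A minimal sketch:

```xml
<system.web>
  <caching>
    <outputCacheSettings>
      <outputCacheProfiles>
        <add name="somepage" duration="60" location="Server" />
      </outputCacheProfiles>
    </outputCacheSettings>
  </caching>
</system.web>
```

The action then picks it up with the OutputCache attribute: [OutputCache(CacheProfile = "somepage")].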

The upside is that you can vary your output cache using information Cloudflare or Varnish don’t have. The downside is that your servers will get more hits, and the more servers you have in a load-balanced setup, the less effective your cache will be.
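To give an idea of how the VaryByCustom mechanism works: you override GetVaryByCustomString (a real ASP.NET Framework hook, typically in Global.asax) and return a string identifying the variation; responses are cached per distinct string. The "userrole" key and the role check below are made-up examples:

```csharp
public class MvcApplication : System.Web.HttpApplication
{
    // Called by ASP.NET for cache profiles that set varyByCustom.
    public override string GetVaryByCustomString(HttpContext context, string custom)
    {
        if (custom == "userrole") // hypothetical key, e.g. varyByCustom="userrole"
        {
            // Vary the server-side cache on internal data a proxy can't see.
            return context.User.IsInRole("admin") ? "admin" : "user";
        }
        return base.GetVaryByCustomString(context, custom);
    }
}
```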

If you drop the location attribute, the Cache-Control header will be set to whatever time you have set in your cache profile, and proxies can start caching the results.