Category Archives: software

wake up Windows automatically using a scheduled task

This seems easy at first, but it turns out to be a lot harder than expected, due to various OS and power settings that get in the way. The idea is simple: you want to have your Windows 10 machine start up at a specific time using a task set in the task scheduler.

When you create a new task in the Task Scheduler, there’s a setting for this on the Conditions tab, so it looks easy. You activate the option to wake the computer to execute your task, and you’re done. Right? Well, if you’re really lucky that might work straight out of the box.
In case it doesn’t work, here are some things you can fiddle with to try and get it going.

  1. Create a wake-up task that doesn’t really do anything. Just run a command like this (see the PowerShell sketch after this list):
    cmd /c echo %date%
    For a more detailed guide on how to set up a task and the wake timers check this excellent article.
  2. Create a second task that runs a batch script doing the actual work, but schedule it 10 minutes later than the first. That gives your computer ample time to start up, install any potential updates, or do whatever it sometimes does that takes so damn long.
  3. Deactivate Fast Startup. It messes with hibernation mode and causes the machine not to start up again afterwards.
  4. Always hibernate your machine. If you just do a plain shutdown it doesn’t seem to automatically start up again. You can do this with the command below. This is also handy if you want to shut your machine down again after your task has finished:
    shutdown /h
  5. Enable wake timers in your Power Options advanced settings. See the link below for instructions.
  6. For laptops, make sure your Power Options are configured correctly, and mind the lid: in my case, for example, a closed lid prevents the laptop from starting up. By default, laptops also won’t start up while running on battery.
  7. Reboot your machine after fiddling with these settings if it still doesn’t work. This makes sure your settings stick.
  8. If you tried everything and it still doesn’t work, check your BIOS settings and see if there might be a power option in there that might prevent it from starting up automatically.
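
For reference, you can also create such a wake-up task from PowerShell instead of the GUI. This is a minimal sketch using the built-in ScheduledTasks cmdlets; the task name, command and trigger time are just examples, so adjust them to taste:

# Create a dummy task that wakes the machine daily at 03:00
$action   = New-ScheduledTaskAction -Execute 'cmd.exe' -Argument '/c echo %date%'
$trigger  = New-ScheduledTaskTrigger -Daily -At '3:00 AM'
$settings = New-ScheduledTaskSettingsSet -WakeToRun   # the "wake the computer" option from step 1
Register-ScheduledTask -TaskName 'WakeUp' -Action $action -Trigger $trigger -Settings $settings

# Allow wake timers in the active power plan (step 5), then re-apply the plan
powercfg /setacvalueindex SCHEME_CURRENT SUB_SLEEP RTCWAKE 1
powercfg /setactive SCHEME_CURRENT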

Give your machine a few minutes if you’re testing this. Yes, it’s annoying to wait, but if you schedule the wake-up too soon, the machine might not be fully powered down in time to start up again, and it will miss the timer. Then you won’t be able to tell whether it’s actually working.

Hopefully you can now use your desktop or laptop to run some nightly scheduled jobs, without having to have a dedicated always-on machine around. Saves you time and money on power consumption, hardware and maintenance!

automatically delete emails with IMAP Cleanup

Let’s say you have this IoT device like a motion detection camera. Which sends you emails. Emails you keep in a separate IMAP mailbox. Lots of emails. So you want to delete those emails in some automated fashion, because doing that daily is oh so boring (remember, lots of emails).

Wouldn’t it be great if there was some handy command line tool that would delete the oldest emails and keep only the most recent 1,000 or 500 of them? Well yes, that way I could script that annoying job and run it daily.

I didn’t find something that already did this. So I figured I’d be able to whip something up in an hour or so using PowerShell, or maybe a small .NET console application using an existing IMAP library.

Well, 3 different IMAP libraries and about 4 hours later I did have something rudimentary that finally did what it was supposed to do: delete the oldest emails, and leave the most recent 1000 (or whatever number you want) behind. That took longer than expected, so to win back some of the time spent on this I threw the ImapCleanup tool on GitHub, including binary downloads. I hope someone else will find this useful as well.

Beware though. This tool deletes emails. Be careful which mailbox you point this at, and make sure you test it in advance on a dummy mailbox. Maybe your email server behaves differently than mine, and important emails get digitally shredded by mistake.
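
If you’d rather roll your own, here’s a rough PowerShell sketch of the core idea. It is emphatically not the actual ImapCleanup code: it assumes you’ve dropped MailKit.dll and its MimeKit.dll dependency next to the script, and the server, credentials and number of mails to keep are placeholders.

# Load the IMAP library (assumed to sit next to the script)
Add-Type -Path ".\MimeKit.dll"
Add-Type -Path ".\MailKit.dll"

$keep   = 1000
$client = New-Object MailKit.Net.Imap.ImapClient
$client.Connect("imap.example.com", 993, $true)
$client.Authenticate("user", "password")

$folder = $client.Inbox
$folder.Open([MailKit.FolderAccess]::ReadWrite) | Out-Null

# IMAP UIDs ascend in order of arrival, so the oldest messages come first
$uids = $folder.Search([MailKit.Search.SearchQuery]::All)
if ($uids.Count -gt $keep) {
    $oldest = [MailKit.UniqueId[]]($uids | Select-Object -First ($uids.Count - $keep))
    $folder.AddFlags($oldest, [MailKit.MessageFlags]::Deleted, $true)
    $folder.Expunge()
}

$client.Disconnect($true)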

publish a static website to Azure using GitHub actions

Last post I talked about setting up a serverless website on Azure using BLOB storage. What I didn’t go into is how to publish files to that site automatically. Yes, you can use the BLOB explorer to manually upload your files, but seriously, who wants to do that kind of boring task when you can let computers do it for you?

Instead, what I do to publish this excellent programming guidelines website is the following:

  • I make my changes locally and test them.
  • I commit & push my changes to the master branch of my git repository.
  • A GitHub action kicks in and publishes the site to the Azure BLOB container.

How sweet is that? Pretty sweet, I know. How do you set this up? Well, let me take you through the steps, my friend, and automated Git deployment will soon be yours to enjoy as well.

  • You need to create a Git repository on GitHub.
    Now that you can create unlimited private repositories, you don’t even have to expose it to the public, which is nice.
  • Clone the repo locally, and create a source/ directory in it. This is where the source code will go, and that’s what we’ll push to the Azure BLOB container. Any other files you don’t want published go in the root, or in other folders in the root of your repository.
  • Copy your source code into the source/ folder, or create a simple index.html file for testing the publishing action.
  • Go to your repository page on the GitHub site, and click the Actions tab at the top.
  • Click New Workflow, choose “set up a workflow yourself”.
  • It will now create a YAML file for you containing your workflow code.
  • Paste the content for your YAML file listed below. Notice the “source” folder in there? That indicates which folder will be copied to Azure.
    In case you run into trouble, you can dig into the Azure Storage Action setup yourself, but it should do the trick.
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps: 
    - uses: actions/checkout@v1
    - uses: actions/setup-dotnet@v1
      with:
        dotnet-version: '3.0.100'
    - uses: lauchacarro/Azure-Storage-Action@v1.0
      with:
        enabled-static-website: 'true'
        folder: 'source'
        index-document: 'index.html'
        error-document: '404.html' 
        connection-string: ${{ secrets.CONNECTION_STRING }}
  • Last step is to set up that CONNECTION_STRING secret. This is the connection string to your Azure storage container. You can set the secret from your GitHub repository Settings tab, under Secrets.
    Click New Secret, then use the name CONNECTION_STRING and paste the access key value from your Azure storage account.
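
If you’d rather not copy the connection string from the portal, you can also fetch it from PowerShell with the Azure CLI. A sketch, assuming the CLI is installed and logged in, with placeholder names:

az storage account show-connection-string `
    --name mystorageaccount `
    --resource-group my-resource-group `
    --query connectionString `
    --output tsv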

You’re all set up!
To test your publishing flow, all you need to do now is push a commit to your master branch, and see the GitHub action kick in and do its thing.
You should see your site appear in a few seconds. Sweet!

Update: recently I found out the workflow broke because of a bug in the latest version of the action. To bypass this I pinned the version in my workflow YAML file to v1.0, which still works. It’s probably a good idea to avoid this kind of breaking change by pinning the version of any action you use in your GitHub workflows anyway. It will avoid those annoying issues where things work one day, and don’t the next.

how to host a serverless static website on azure

For my little gfpg project I wanted to put a simple static website online without having to set up and maintain a web server. I read about going serverless with a static site using S3 on AWS, but I wanted to try that on Azure instead. BLOB storage seemed the obvious alternative to S3, but it took some searching around and finding the right documentation on MSDN to get it all up and running.

If you’re on a similar quest to publish some static content to Azure BLOB storage as a serverless website, this short guide will help you along.

  1. First of all we need to create an Azure BLOB storage account for the site. The most important part is to choose a general-purpose v2 Standard storage account as the account kind. This is the only type that supports hosting a static website. Guess who didn’t do that.
  2. Next thing is to enable static hosting of your files. This will create a $web folder in your storage account, which will be the root folder of your website. It’s that simple.
  3. Copy your files into the $web folder using the Storage explorer blade in the Storage account menu, or the Storage explorer app. You can already test your site using the Azure endpoint.
The Storage explorer is a quick and easy way to upload and manage your files in the BLOB storage account.
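
If you prefer the command line over the portal, the same three steps look roughly like this using the Azure CLI from PowerShell. This is a sketch with placeholder names, assuming the CLI is installed and logged in:

# 1. Create a general-purpose v2 storage account
az storage account create `
    --name mystorageaccount `
    --resource-group my-resource-group `
    --kind StorageV2 `
    --sku Standard_LRS

# 2. Enable static website hosting (this creates the $web container)
az storage blob service-properties update `
    --account-name mystorageaccount `
    --static-website `
    --index-document index.html `
    --404-document 404.html

# 3. Upload your files to the $web container
az storage blob upload-batch `
    --account-name mystorageaccount `
    --source ./site `
    --destination '$web'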

You can stop here if this is a personal project and you don’t need HTTPS support or a custom domain. In my case, I did want to go all the way, so here’s how to get that working as well.

  1. Get a domain name. Make it sassy ;). Make sure your domain registrar allows you to edit the CNAME records for your domain. This is pretty standard, but not all cheap web hosts allow it, and you’ll need it later on to hook up your domain to Azure.
  2. Set up an Azure CDN endpoint for your static site. I picked the Microsoft CDN option which is the most basic one, so you don’t need any accounts with a third party CDN provider.
  3. Now you can map your custom domain to your Azure CDN endpoint using a CNAME record.
  4. Create an HTTPS certificate for your site on Azure with just a few clicks. I was afraid this was going to be hard but it’s so damn easy it’s beautiful. There really is no excuse anymore to let your site just sit there on HTTP these days.
  5. Last thing to do is set up some caching rules for the CDN. We don’t want to hit the “slow” BLOB storage all the time when we can serve from the faster CDN instead. Depending on the option you chose for the CDN this will differ, but if you picked the Microsoft one you have to use the Standard rules engine to set your caching rules. If you picked Akamai or Verizon, you can use CDN caching rules instead.
    For a simple setup on the Microsoft CDN, go to the CDN settings Rules engine page, and set a global cache expiration rule with the Override behavior and an expiration you like.
    After a few minutes you’ll see the cache header appear in your HTTP responses.
  6. Here you can also create a rule to redirect HTTP traffic to HTTPS, so people don’t accidentally hit the insecure version.
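
These CDN steps can be scripted with the Azure CLI too, which is handy if you ever need to repeat them. Again a sketch with placeholder names; the --origin value is the hostname of your storage account’s static website endpoint, which the portal shows when you enable static hosting:

az cdn profile create `
    --name my-cdn-profile `
    --resource-group my-resource-group `
    --sku Standard_Microsoft

az cdn endpoint create `
    --name my-endpoint `
    --profile-name my-cdn-profile `
    --resource-group my-resource-group `
    --origin mystorageaccount.zXX.web.core.windows.net

az cdn custom-domain create `
    --name my-domain `
    --endpoint-name my-endpoint `
    --profile-name my-cdn-profile `
    --resource-group my-resource-group `
    --hostname www.example.com

az cdn custom-domain enable-https `
    --name my-domain `
    --endpoint-name my-endpoint `
    --profile-name my-cdn-profile `
    --resource-group my-resource-group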

One more tip on the CDN. You can also purge the CDN cache after you’ve pushed an update to your site, to apply the changes before the CDN cache expires. This is handy if you’ve set a rather long expiration time because you don’t expect the site to change very often.

From the CDN account, you can purge content on a specific path, or everything at once.
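
With the CLI a purge is a one-liner as well; a sketch using the same placeholder names as above:

az cdn endpoint purge `
    --name my-endpoint `
    --profile-name my-cdn-profile `
    --resource-group my-resource-group `
    --content-paths '/*'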

use a PowerShell script as a Vim filter to encode a url

Vim filters are cool. They let you run the content of your current buffer through a command and have its output returned into the buffer. That means you can use filters to edit text using any command your operating system has available.

Now on a Linux machine that’s quite handy. On a Windows machine it isn’t, because the default shell is CMD and that doesn’t have all those handy Unix text manipulation utilities. But what about using a PowerShell script, to encode or decode a URL for example?

It turns out that wasn’t as easy as I expected, so I’ll spill it here in case you’re looking to do something similar.

First, calling the script from Vim means you have to invoke powershell.exe explicitly, because by default Vim uses cmd.exe as its shell on Windows. That isn’t so hard:

nnoremap <Leader>dc :%!powershell.exe -noprofile -nologo -file c:\tools\decode-string.ps1<CR>
nnoremap <Leader>ec :%!powershell.exe -noprofile -nologo -file c:\tools\encode-string.ps1<CR>

I create 2 shortcuts here, to encode (ec) and decode (dc), triggered using the leader key.
These will pipe the content of your buffer through the script, and put whatever it spits out on standard output back into your buffer.
The extra command line parameters I pass to powershell.exe are there to make it as fast as possible and skip any unnecessary junk. The -noprofile flag is the most important one, as it skips loading any modules you have set up in your PowerShell profile.

Reading from the pipe in a PowerShell script wasn’t as easy as I thought. Using the ValueFromPipeline attribute on the input parameter didn’t work for some reason. After some searching I came across the magical $input automatic variable, which holds anything the script receives from standard input. Handy.
So the script was as simple as doing this:

[Reflection.Assembly]::LoadWithPartialName("System.Web") | Out-Null
[System.Web.HttpUtility]::UrlEncode($input)

What do we do when PowerShell can’t help us out of the box? That’s right. We summon the powers of .NET assemblies to get the job done. In this case, we’re summoning System.Web and using the HttpUtility class to encode the incoming data.
We do exactly the same to decode text by the way:

[Reflection.Assembly]::LoadWithPartialName("System.Web") | Out-Null
[System.Web.HttpUtility]::UrlDecode($input)

This is pretty powerful. By using the $input variable in PowerShell together with the Vim filter function, you can do all sorts of text transformations. This can make your developer life a lot easier.
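
For example, here’s a hypothetical companion script in the same style that Base64-encodes whatever flows through the filter; hook it up to another mapping just like the ones above:

# base64-encode.ps1: read the buffer from the pipeline, emit the Base64 version
$text = ($input | Out-String)
[Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes($text))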

fight corona with folding@home on a headless ubuntu linux box

A while back I wrote about using grid computing to fight cancer with leftover Azure credits.
So now with Corona having the world in its grip, it’s time to shift our attention to that nasty virus and throw some CPU cycles at it instead.

If you have an Ubuntu machine running somewhere and you want to install the Folding@Home client on it, you can do so by following the simple steps listed below.

First download the FAH client application:

wget https://download.foldingathome.org/releases/public/release/fahclient/debian-testing-64bit/v7.4/fahclient_7.4.4_amd64.deb

Then, run the installer. It will guide you through the setup process.

sudo dpkg -i --force-depends fahclient_7.4.4_amd64.deb

Just choose a name and an optional team. You’d best let it start automatically as well.

That’s basically it. Since you’re running it all headless, there’s no need for any of the other packages listed on the site.

You can check if things are running by looking at the log file stored in /var/lib/fahclient/log.txt.

If you want to change the configuration after installation, edit /etc/fahclient/config.xml and then stop and start the client like this:

sudo /etc/init.d/FAHClient stop
sudo /etc/init.d/FAHClient start
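
For reference, a stripped-down config.xml might look something like the snippet below. This is a sketch; the values are placeholders, and the file the installer generated for you will contain your own:

<config>
  <!-- identity: use your own name, team and passkey -->
  <user v='yourname'/>
  <team v='0'/>
  <passkey v=''/>
  <!-- how much CPU to donate: light, medium or full -->
  <power v='full'/>
</config>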

If you had the World Community Grid (BOINC) client installed already, you can stop it with:

sudo /etc/init.d/boinc-client stop

If you want the full install instructions for the FAHClient, or installation instructions for other flavors of Linux, check out the official installation guide.
Once it’s running and processed a few work units, you can check your stats at this URL: https://stats.foldingathome.org/donor/yourname

Have fun kicking Corona ass!