This seems easy at first, but it turns out to be a lot harder than expected, due to various OS and power settings that get in the way. The idea is simple: you want to have your Windows 10 machine start up at a specific time using a task set in the task scheduler.
When you create a new task in the Task Scheduler, there’s a setting for this on the Conditions tab, so it looks easy: you activate the option to wake the computer to run your task, and you’re done. Right? Well, if you’re really lucky that might work straight out of the box. If it doesn’t, here are some things you can fiddle with to get it going.
Create a wake-up task that doesn’t really do anything. Just run a command like this: cmd /c echo %date%. For a more detailed guide on how to set up the task and the wake timers, check this excellent article.
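If you’d rather script this than click through the Task Scheduler UI, the built-in ScheduledTasks PowerShell module can create the same dummy task. This is a sketch; the task name and trigger time are just examples:

```powershell
# Create a dummy wake-up task; -WakeToRun is the scripted equivalent
# of the "Wake the computer to run this task" checkbox on the Conditions tab.
$action   = New-ScheduledTaskAction -Execute 'cmd' -Argument '/c echo %date%'
$trigger  = New-ScheduledTaskTrigger -Daily -At 03:00
$settings = New-ScheduledTaskSettingsSet -WakeToRun
Register-ScheduledTask -TaskName 'NightlyWakeUp' -Action $action -Trigger $trigger -Settings $settings
```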
Create a second task that does what you want it to do in a batch script but 10 minutes later than the first. That gives your computer ample time to start up, install any potential updates, or do whatever it sometimes does that takes so damn long.
Always hibernate your machine; a plain shutdown doesn’t seem to let it start up again automatically. You can hibernate with the command below, which is also handy for shutting the machine down again after your task has finished: shutdown /h
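Combining the two tips above: a hypothetical batch script for the second task could run the real job and then send the machine back into hibernation (the job path here is made up):

```bat
@echo off
rem nightly.bat - run the actual job, then hibernate again.
call C:\jobs\my-nightly-job.bat
shutdown /h
```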
Enable wake timers in your Power Options advanced settings. See the link below for instructions.
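You can also flip the wake timer setting from an elevated command prompt, and verify that your task actually armed a timer. The GUID below should be the one for the “Allow wake timers” setting, but double-check it against powercfg /query on your machine:

```bat
:: Allow wake timers on the current power plan, for AC and battery power.
powercfg /setacvalueindex SCHEME_CURRENT SUB_SLEEP BD3B718A-0680-4D9D-8AB2-E1D2B4AC806D 1
powercfg /setdcvalueindex SCHEME_CURRENT SUB_SLEEP BD3B718A-0680-4D9D-8AB2-E1D2B4AC806D 1
powercfg /setactive SCHEME_CURRENT
:: List any armed wake timers, to verify your task shows up.
powercfg /waketimers
```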
For laptops, make sure your Power Options are configured correctly, and mind the lid: in my case, a closed lid prevented the laptop from starting up. By default, laptops also won’t start up while running on battery.
Reboot your machine after fiddling with these settings if it still doesn’t work. This makes sure your settings stick.
If you’ve tried everything and it still doesn’t work, check your BIOS settings for a power option that might be preventing the machine from starting up automatically.
Give your machine a few minutes when you’re testing this. Yes, it’s annoying to wait, but if you set the wake time too close, the machine might not have finished powering down before the timer fires, and it will miss it. Then you won’t be able to tell whether it’s actually working.
Hopefully you can now use your desktop or laptop to run some nightly scheduled jobs, without having to have a dedicated always-on machine around. Saves you time and money on power consumption, hardware and maintenance!
Let’s say you have an IoT device like a motion detection camera. Which sends you emails. Emails you keep in a separate IMAP mailbox. Lots of emails. So you want to delete those emails in some automated fashion, because doing that daily is oh so boring (remember, lots of emails).
Wouldn’t it be great if there were some handy command line tool that deleted the oldest emails and kept only the most recent 1,000 or 500 of them? Well yes, that way I could script that annoying job and run it daily.
I didn’t find anything that already did this, so I figured I’d be able to whip something up in an hour or so using PowerShell, or maybe a small .NET console application using an existing IMAP library.
Well, three different IMAP libraries and about four hours later, I had something rudimentary that finally did what it was supposed to do: delete the oldest emails and leave the most recent 1,000 (or whatever number you want) behind. That took longer than expected, so to recoup some of the time spent on this I threw the ImapCleanup tool on GitHub, including binary downloads. I hope someone else will find it useful as well.
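The core selection logic is dead simple though. Here’s a rough sketch of the idea in shell form, with a generated uids.txt standing in for a real list of message UIDs (oldest first, the order an IMAP UID SEARCH typically returns them in):

```shell
#!/bin/sh
# Sketch of the keep-newest-N selection logic.
# uids.txt holds one message UID per line, oldest first.
seq 1 1200 > uids.txt   # stand-in for a real UID export

KEEP=1000
TOTAL=$(wc -l < uids.txt)
if [ "$TOTAL" -gt "$KEEP" ]; then
  # Everything above the newest $KEEP messages is up for deletion.
  head -n "$((TOTAL - KEEP))" uids.txt > delete.txt
else
  : > delete.txt        # nothing to delete
fi
```

Everything in delete.txt would then get flagged \Deleted and expunged via IMAP.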
Beware though. This tool deletes emails. Be careful which mailbox you point this at, and make sure you test it in advance on a dummy mailbox. Maybe your email server behaves differently than mine, and important emails get digitally shredded by mistake.
Last post I talked about setting up a serverless website on Azure using BLOB storage. What I didn’t go into is how to publish files to that site automatically. Yes, you can use the BLOB explorer to manually upload your files, but seriously, who wants to do that kind of boring task when you can let computers do it for you?
I commit & push my changes to the master branch of my git repository.
A GitHub action kicks in and publishes the site to the Azure BLOB container.
How sweet is that? Pretty sweet, I know. How do you set this up? Well let me take you through the steps my friend, and automated Git deployment will soon be yours to enjoy as well.
You need to create a Git repository on GitHub. Now that you can create unlimited private repositories, you don’t even have to expose it to the public, which is nice.
Clone the repo locally, and create a source/ directory in it. This is where the source code will go, and that’s what we’ll push to the Azure BLOB container. Any other files you don’t want published go in the root, or in other folders in the root of your repository.
Copy your source code into the source/ folder, or create a simple index.html file for testing the publishing action.
Go to your repository page on the GitHub site, and click the Actions tab at the top.
Click New Workflow, choose “set up a workflow yourself”.
It will now create a YAML file for you containing your workflow code.
Paste in the content for your YAML file listed below. Notice the “source” folder in there? That indicates which folder will be copied to Azure. If you run into trouble, you can dig into the Azure Storage Action setup yourself, but it should do the trick.
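As an alternative, here’s a rough sketch of what such a workflow could look like using the generic Azure CLI action instead of the dedicated storage action. The source folder and CONNECTION_STRING secret match the steps in this post; the rest is an assumption, so adjust to taste:

```yaml
# Hypothetical alternative: upload source/ to the $web container
# with the Azure CLI action instead of the dedicated storage action.
name: Publish to Azure BLOB storage
on:
  push:
    branches: [master]
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: azure/CLI@v1
        with:
          inlineScript: |
            az storage blob upload-batch \
              --source source \
              --destination '$web' \
              --connection-string '${{ secrets.CONNECTION_STRING }}'
```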
Last step is to set up that CONNECTION_STRING secret. This is the connection string to your Azure storage container. You can set the secret from your GitHub repository Settings tab, under Secrets. Click New Secret, then use the name CONNECTION_STRING and paste the access key value from your Azure storage account.
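If you have the Azure CLI installed, you can also grab that connection string from the command line instead of digging through the portal (the account and resource group names below are examples):

```shell
# Fetch the storage account's connection string to paste into the
# CONNECTION_STRING secret.
az storage account show-connection-string \
  --name mystorageaccount \
  --resource-group my-resource-group \
  --query connectionString \
  --output tsv
```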
You’re all set up! To test your publishing flow, all you need to do now is push a commit to your master branch, and see the GitHub action kick in and do its thing. You should see your site appear in a few seconds. Sweet!
Update: I recently found out the workflow broke because of a bug in the latest version of the action. To bypass this, I pinned the version in my workflow YAML file to v1.0, which still works. It’s probably a good idea to avoid this kind of breaking change by pinning the version of any action you use in your GitHub workflows anyway. It will avoid those annoying issues where things work one day and don’t the next.
For my little gfpg project I wanted to put a simple static website online without having to set up and maintain a web server. I read about going serverless with a static site using S3 on AWS, but I wanted to try that on Azure instead. BLOB storage seemed the obvious alternative to S3, but it took some searching around and finding the right documentation on MSDN to get it all up and running.
If you’re on a similar quest to publish some static content to Azure BLOB storage as a serverless website, this short guide will help you along.
First of all, we need to create an Azure BLOB storage account for the site. The most important part is to choose general-purpose v2 Standard for the account kind; this is the only type that supports hosting a static website. Guess who didn’t do that.
Next thing is to enable static hosting of your files. This will create a $web folder in your storage account, which will be the root folder of your website. It’s that simple.
Copy your files into the $web folder using the Storage explorer blade in the Storage account menu, or the Storage explorer app. You can already test your site using the Azure endpoint.
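If you prefer the command line over clicking through the portal, the Azure CLI can do both of these steps too; the account name and local folder below are examples:

```shell
# Enable static website hosting; this creates the $web container.
az storage blob service-properties update \
  --account-name mystorageaccount \
  --static-website \
  --index-document index.html

# Copy your local files into $web.
az storage blob upload-batch \
  --account-name mystorageaccount \
  --source ./site \
  --destination '$web'
```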
You can stop here if this is a personal project and you don’t need HTTPS support or a custom domain. In my case, I did want to go all the way, so here’s how to get that working as well.
Get a domain name. Make it sassy ;). Make sure your domain registrar allows you to edit the CNAME records for your domain. This is pretty standard, but not all cheap web hosters allow it, and you’ll need it later on to hook your domain up to Azure.
Create an HTTPS certificate for your site on Azure with just a few clicks. I was afraid this was going to be hard but it’s so damn easy it’s beautiful. There really is no excuse anymore to let your site just sit there on HTTP these days.
The last thing to do is set up some caching rules for the CDN; we don’t want to hit the “slow” BLOB storage all the time when we can use the faster CDN instead. How you do this depends on the CDN option you chose: if you picked the Microsoft CDN, you have to use the Standard rules engine to set your caching rules, while Akamai and Verizon let you use CDN caching rules instead. For a simple setup on the Microsoft CDN, go to the Rules engine page in the CDN settings, and set a global cache expiration rule with the override behavior and an expiration you like. After a few minutes you’ll see the cache header appear in your HTTP requests.
Here you can also create a rule to redirect HTTP traffic to HTTPS, so people don’t accidentally hit the insecure version.
One more tip on the CDN: you can purge the CDN cache after you’ve pushed an update to your site, to apply the changes before the cache expires. This is handy if you’ve set a rather long expiration time because you don’t expect the site to change very often.
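With the Azure CLI this is a one-liner; the profile, endpoint, and resource group names below are examples:

```shell
# Purge everything from the CDN endpoint after publishing an update.
az cdn endpoint purge \
  --resource-group my-resource-group \
  --profile-name my-cdn-profile \
  --name my-endpoint \
  --content-paths '/*'
```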