DFS Server Data Hard Drive Upgrades

[sc:hardware-category ]I recently upgraded the memory in my servers and now it’s time to upgrade the data volumes as well.

When I built the servers, I invested in 2TB hard drives for the data volumes and used DFSR to replicate the data between the servers.  This ensured I had multiple copies of the data and protected against hardware failure in a single server.

I had a close call a few years ago when my single server’s power supply decided to die.  Luckily it was a nice quiet death and it didn’t take the motherboard or drives out with it, but it certainly was at the front of my mind when I built the new twin servers to replace it.

I was down to about 150GB of space left on the 2TB drives and so it was time to replace them with something larger (I could have just added another drive, but I’m already at capacity for the SATA ports on the motherboard so it was just easier to replace it).  The new drive I chose was the Seagate Green 6TB drive.

My DFS servers are VMs and since I am dedicating the entire drive to them I connected the 2TB drives directly to the VMs instead of taking on the overhead of creating a VMDK on them.  I wasn’t sure what would happen when I just swapped out the drives so I decided to play it safe.

First I shut down the first DFS VM, then removed the old drive and disabled networking on it.

I then powered down the VM host, swapped the drives and restarted the host.

Once it was powered back up, I took the new drive offline, added it to the VM and restarted the VM without any network connection.

Once up, I connected to the VM, created a new partition on the drive, gave it the same drive letter and then created the basic folder structure the old drive had.  Windows still knew about the shares and they became available as soon as I had created the folders for them.
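
For anyone doing the same swap, the disk prep inside the VM is just the usual diskpart routine, something like this (a rough sketch; the disk number and drive letter are placeholders, and anything over 2TB needs a GPT partition table):

    diskpart
    rem pick the new, empty disk from the list
    list disk
    select disk 2
    online disk
    attributes disk clear readonly
    rem drives larger than 2TB need GPT rather than MBR
    convert gpt
    create partition primary
    format fs=ntfs quick label="Data"
    rem give it the same letter the old data volume used
    assign letter=D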

I then brought the server back on to the network and waited to see if replication would start.

That turns out to have been a little too optimistic 🙂

DFSR relies on information contained in the “System Volume Information” (SVI) directory on the disk, which of course was missing on the new drive.  DFSR interprets that to mean that it hasn’t communicated with the server in a long time and refuses to replicate with it.

My first instinct was to remove the server from the replication group and then add it back in, making sure to force an AD poll with dfsrdiag.exe.  But that didn’t seem to work.  I found some reference articles talking about deleting some of the information in the SVI directory, but that seemed risky.

I decided to simply delete the replication groups and then re-create them, again forcing an AD poll.
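
The AD poll itself is a one-liner (SERVER01 is a placeholder for the member’s name; on newer servers the Update-DfsrConfigurationFromAD PowerShell cmdlet should do the same thing):

    dfsrdiag pollad /member:SERVER01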

After a few minutes, replication restarted and files began to appear on the new drive.  Of course replicating 2TB of data takes a while and DFSR isn’t the fastest, but it does work and after a few days everything was back in place.
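
If you want to watch the progress, dfsrdiag can also report the backlog between two members; the group, folder and server names below are placeholders:

    dfsrdiag backlog /rgname:"Data" /rfname:"Data" /smem:SERVER01 /rmem:SERVER02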

Of course, I then had to do the same thing on the second server, but that went smoothly and both servers now have their new 6TB data volumes.

DisplayPort and Window Positions

[sc:hardware-category ]I’ve been running my 4K monitor for a while now, connected through DisplayPort.  This was the first time I’d used DisplayPort (my old monitor was connected through HDMI), and I noticed that whenever I turned off the monitor, all my window positions reset to the top left corner.

This was strange, but after hunting around the net for a while I found some information that basically indicated it was because Windows treats DisplayPort devices just like any other hot plug device, and when you turn off the monitor it effectively “removes” the display from Windows.

Of course this means Windows drops back to its default display mode, which happens to be 1024×768, hence all the windows shrinking to fit in that display size.

Microsoft has a KB on the issue, but in the end the options are basically:

  1. Live with it
  2. Let the monitor go to sleep instead of turning it off
  3. Shutdown the system when you want to turn the monitor off
  4. Don’t use DisplayPort

At the moment I’ve opted for #2, just letting the display go to sleep, which seems to work alright.  My only concern would be power outages if the monitor weren’t on a UPS, but I have both the monitor and the PC on a UPS so that shouldn’t be a problem.

I may just go back to HDMI; it works fine with 4K displays and doesn’t appear to have the issue.

Dell 28 Ultra HD Monitor – P2815Q

[sc:hardware-category ]Back on Cyber Monday, the Microsoft Store had a sale on the Dell P2815Q 4K display and I decided to pick one up.

I’d been looking for a 4K monitor and at $299 for the 28″ Dell it was hard to argue with the price.  It arrived a few days later and unpacking it was easy enough.  It comes with a stand, power cord and DisplayPort cable.

The first thing I did was plug it into my existing desktop PC using my existing HDMI cable.  That PC has an older video card in it (no DisplayPort) and I didn’t expect it to be able to support the 4K resolution of the monitor, but I did expect it to do something.  Unfortunately, while I could access the on-screen controls for the monitor, it stubbornly refused to display my desktop 🙁

My first thought was just that the video card couldn’t match the monitor and so I grabbed my laptop and plugged it in.  Same thing.

Of course my next thought was a dead display, but that seemed unlikely so I took a gamble and swapped out my HDMI cable (which still works fine with my old HD monitor) and suddenly I had video!

Video at 1920×1200 of course, as that was all my old video card could handle.  A quick trip to the local computer shop and I was the proud owner of an AMD R7 250 card with DisplayPort.  I swapped the card, plugged in the cable and suddenly had nothing on the display again!

It turns out the Dell doesn’t support auto detection of the input type and I had to manually change the setting.  Not a big deal, but kind of a strange oversight.

After installing the new video card drivers I had a full 4K display up and running.

The monitor itself is a stylish affair; the main body is very dark grey with four physical buttons along the bottom right side.  The base stand is light grey and supports rotating the monitor from landscape to portrait mode.

I have to admit I REALLY like my old Samsung monitor’s 16:10 aspect ratio; 16:9 just seems too wide, but it does have some advantages when you have multiple documents or websites up at the same time.  I think I’ll get used to it in time.

Of course, fitting 4 times the resolution of my old monitor into something only ~20% bigger makes all those pixels a lot smaller.  Windows does support 4K, but not very well.  The scaling works OK, but not perfectly, so I’ve set it to 125%, which seems to be the best compromise at the moment.  Hopefully Windows 10 will have better support for 4K next year.

The color saturation is a little pale and I’m still playing with the settings.  The defaults are terrible, like most monitors’, and are designed to show off the display in a brightly lit store.

Overall I think it’s a little early for most people to adopt a 4K monitor, but I’ll stick with it for a while and the extra resolution is very nice 🙂

Messaging Apps: Threema

[sc:software-category ]There is a plethora of messaging apps out there these days, but most of them are tied to or owned by big social networks of some kind.

I recently talked a bit about Bleep but development of it has been pretty slow.  I found Threema in the Windows Phone Store and decided to give it a go.

First things first, it’s not free, but that’s to be expected.  Threema focuses on security and that pretty much negates a “free” business model that involves advertising as their revenue stream.

It is cross-platform, so you don’t have to have Windows Phone to use it, though there is no Windows version.

Setup is simple: they use random input from swiping around the screen to generate a private/public key pair and then you’re off to the races.

Threema is both secure and can be anonymous as well, as it doesn’t require you to enter your phone number or e-mail address as part of the sign-up process.

Of course you have to find someone else who has Threema to message them but that isn’t a big hurdle since the app is only $1.99.

Adding contacts is perhaps the only weak point I found with Threema.  A friend downloaded it and I scanned the QR code for him.  It added him to my contacts and so I sent him a message.  On his end, there wasn’t an obvious way to add me to his contacts (without scanning my QR code), but after fiddling around a bit we found you could start a new chat with me and then add me to the contacts list from there.

Threema supports group chats, delivery and read receipts, and just about everything else you might want in a messaging app, so I’m going to see if I can get a few more people on it and give it a good workout.

I’m still interested to see what happens with Bleep, but it may be a while before anything happens with it and even longer still before there is a Windows Phone version.

Upgrading OpenVPN Access Server to Ubuntu 14

[sc:linux-category ]In my last post I talked about my OpenVPN Access Servers and a problem I was having; while working on that I also noticed that they were still running Ubuntu 12.

A while ago I upgraded my Ubuntu server through the in-place upgrade process and so I was reasonably comfortable with it.  However, as this was a VM I hadn’t built myself but had instead downloaded from OpenVPN, I decided to take a look around and see if there were any gotchas with it.

A search didn’t turn up anything and overall there was a real lack of information on the OpenVPN site.  In the end I decided to simply take a snapshot of my backup node and go through with the upgrade process.

I won’t go into detail about the upgrade process, you can read my previous post for that, but it went smoothly and after I restarted the server, OpenVPN came up as well.
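
For reference, the short version is just the standard Ubuntu release upgrade, run on the snapshotted backup node first:

    sudo apt-get update
    sudo apt-get dist-upgrade
    sudo apt-get install update-manager-core
    sudo do-release-upgrade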

Of course I needed to test the backup node, which meant taking down the primary node.  My first instinct (which in this case was wrong) was to simply shut down the OpenVPN service on the primary node.  That doesn’t work because UCARP doesn’t actually monitor the service on the primary node, just the IP address.  I decided it was simpler to just shut down the whole server.

Once down, the backup node took over the services and everything was fine.
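
A quick way to confirm the failover is to check that the UCARP shared address has shown up on the backup node (eth0 being whichever interface UCARP is bound to):

    ip addr show eth0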

I simply repeated the process on the primary node and both functioned as expected.