Raspberry Pi 2 - Then and Now, a Comparison

Good things come in little packages, or so the saying goes. Rather than trying to cheer up a boy whose growth spurt hadn't kicked in (cough), it's clear that statement was designed to describe the Raspberry Pi perfectly: the tiny computer that could.

Although it was primarily designed for the education market, when the Pi was first released I, like many others, was incredibly keen to get my hands on one purely to tinker and see if the box really was as good as reports had led me to believe.

So What's New?


1.jpg

It's been three years since the original Pi launched and seemed to single-handedly send the internet into meltdown. In the intervening years, several new versions of the board have been released, each adding small incremental improvements: two additional USB ports arrived with the Model B+, and the RAM was doubled from 256MB (in the Model A and Model B Rev 1) to 512MB in subsequent models. Due in no small part to the incredible demand for the Pi, it was some time before I was able to get my hands on one, picking up the Model B Rev 2, which came with just two USB ports and a friction-lock SD card slot.

But the latest release, the Raspberry Pi 2 Model B, really moves things along and bumps the Pi up at least a couple of notches in terms of performance capabilities.

2.jpg

So how do the two models compare? On paper at least, the Pi 2 should be much quicker, with double the RAM (1GB vs 512MB in my original) and a 900MHz quad-core ARM Cortex-A7 processor (vs a 700MHz single-core ARM1176JZF-S).

Performance Benchmarks

To get a theoretical impression of how much difference there actually was between my Pi and the new model, I needed to run some benchmarks. Fortunately, Roy Longbottom’s Raspberry Pi benchmarks were just the thing I needed.

3.png

Whetstone

The Whetstone benchmark is one of the original benchmark tests and looks at the number of floating-point calculations carried out per second (Millions of Whetstone Instructions Per Second, or MWIPS).

4.jpg

Dhrystone

The Dhrystone benchmark (a pun on the older Whetstone benchmark that's hilarious if you were a Fortran dev in the early 80s) is intended to be representative of integer programming and tests general CPU performance. The main output is the number of Dhrystones per second, where one Dhrystone is a single iteration of the benchmark's main code loop. Basically, more is better.
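As a rough illustration of the idea (not the real Dhrystone kernel), a score of this shape can be sketched by timing a fixed number of passes through a small integer work loop — the loop body here is a made-up stand-in:

```python
import time

def loops_per_second(iterations=1_000_000):
    """Time a simple integer work loop and report iterations per second.
    The loop body is a placeholder, not the real Dhrystone kernel."""
    a, b = 0, 1
    start = time.perf_counter()
    for _ in range(iterations):
        a = (a + b) & 0xFFFF  # stand-in integer arithmetic
        b = (b + 3) & 0xFFFF
    elapsed = time.perf_counter() - start
    return iterations / elapsed

print(f"{loops_per_second():,.0f} loops/sec")
```

The real benchmark's loop exercises a carefully chosen mix of integer operations, but the scoring principle — iterations of a fixed loop divided by elapsed time — is the same.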

5.jpg

So with these results, the Pi 2 should handle all the tasks I currently do much more smoothly and efficiently. But more importantly, I was keen to see how much more I would now be able to achieve with the additional horsepower behind me.

Project Tests

Media Server

The first project that sprang to mind when I thought about the new compute power at my disposal was the home theatre I have set up with the Pi. It isn't an ideal system (I can't easily access Netflix, for instance), but the incredibly low power draw means I can quite happily have it set up in the spare room, ready to go when I want to watch something, and keep more powerful systems for my more regular viewing.

At present I'm using OpenELEC, one of the two main XBMC/Kodi distributions that are compatible with the Pi. I've tried both, and OpenELEC comes out on top as it's been slightly less prone to crashing on me. The Model B is more than adequate for a simple media server. It's not really capable of running any of the more complex theme packs, but it's happy with posters and background art, and I've not had any problems connecting to my NAS using SMB or using the Yatse Android app as a remote control.

The system is also quite happy playing 1080p video though it can take a little while for videos to start playing. The only real downside is that sometimes navigating menus and actually getting to the content can be a little choppy and menus don't load as quickly as I've come to expect.

With the Pi 2, these minor niggles disappeared. Menus are much snappier and the speed of loading posters and background art seemed markedly improved. This was one of the more important things for me as having a system which not only works well, but looks good is important when you're trying to win the debate about whether you need another gadget in the lounge.

Another indicator of how much of an improvement the system has seen is that I was also able to run a range of the other, more demanding theme packs available. The original Pi was essentially limited to the Confluence theme due to performance issues.

Programming/Desktop Use

The improved ARMv7 processor in the Pi 2 definitely makes a difference to the general usability of the desktop environment. I've run both models on Raspbian, a Debian Wheezy distro, and the improvement in performance is definitely appreciable. Whilst the system isn't blazingly fast and responsive, that should never be the expectation; for the price, the performance is perfectly satisfactory.

The original Model B was interesting to use as a desktop for an hour or two, but the speed limitations meant it quickly became an exercise in frustration to get anything productive done. Tasks such as opening the browser seemed to take north of 30 or 40 seconds, and anything requiring more processor power often dragged on even longer.

The Pi 2 solves a lot of these problems. Boot-up time is pretty much half what it was, and opening and using programs feels much more responsive. Using a range of the programs that come pre-installed with Raspbian, as well as some others I've installed, the difference is marked.

Embedded Projects

Embedded projects are one of the few areas where, if you already own and use a Pi, an upgrade probably isn't going to make much of a difference. If it's the kind of project where you don't intend to tinker regularly, or the activity isn't one that demands a lot from the Pi, you're unlikely to see any real benefit from upgrading. That is, of course, unless like me you've got an older Model B, which only came with two USB ports; there are probably a number of occasions on which those extra ports will come in handy.

Conclusions

So, should you be tempted by the new Raspberry Pi model? If you're planning on doing anything that requires even a modicum of processing power, then the new model is a must. Not only does it remove a lot of the frustration that can build up from using an under-powered device, but it opens a number of doors to new projects that can really take advantage of the upgrades.

What is really exciting is the announcement from Microsoft that it is creating a freely available version of Windows 10 that will run on the new Pi 2 as part of its Windows Developer Program for IoT.

There are also a number of other things I'm keen to try out on the new board, game emulation being one. The Pi 2 should be able to handle PS1-level emulation well (as opposed to older 16-bit systems on the original), so if this is an area you want to explore, the upgrade is definitely worth considering.

That said, there are a couple of things to consider before you make your purchase. If, like me, you've got the original Model B, it pre-dates the addition of a couple of USB ports and a re-jig of the board's components, which means you'll also need to splash out on a new case. Not exactly a huge investment, but definitely something to bear in mind. If you've got the Model B+, then any cases you've got are still compatible.


Asus Xonar DGX 5.1



Recently I acquired a new screen for my gaming PC: a Panasonic TX-L65WT600 65-inch 4K panel to sit in my lounge (I use a wireless keyboard and mouse/gamepad for gaming), replacing a 50-inch 1080p LG plasma.

3-full.jpg

Having heard about the benefits of Ultra-HD, I was excited to try it out from a PC gamer’s point of view, and hopefully some of you might find my initial impressions of the new format interesting!

Initial Setup

When booting into Windows 7 for the first time using this screen, my initial impression was “Everything is TINY”. Desktop icons and the taskbar were so small that I could hardly make any of them out, but when I ventured nearer the screen, they were all perfectly displayed and it was difficult to make out individual pixels.

1-full.jpg

In order to make everything viewable from a distance, I set the desktop resolution to 1920x1080 (using an HDMI cable). In theory this should work well, as 4K has four times the pixels of 1080p (meaning four pixels at 4K = one pixel at 1080p). This did work nicely; however, in order to get above 30fps in games, a DisplayPort cable is required, as HDMI 1.4 doesn't have the bandwidth for 60fps/60Hz.
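The pixel arithmetic above is easy to check:

```python
uhd = 3840 * 2160  # 4K UHD pixel count
fhd = 1920 * 1080  # 1080p pixel count
print(uhd // fhd)  # each 1080p pixel maps onto a 2x2 block of UHD pixels
```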

I therefore plugged in the DisplayPort cable and tried to do the same thing with the desktop resolution, but to no avail. It appears that the supported resolutions over DisplayPort differ greatly from those over HDMI (setting custom desktop resolutions with the DisplayPort cable ended up crashing the Nvidia drivers), so I had to leave it at 4K native resolution and increase the scaling of text and icons within Windows to 200%. This is due to the way DisplayPort drives the panel (likely DisplayPort 1.2's multi-stream transport): it essentially splits the display into two halves vertically and then stitches them together to create one display (the BIOS screen and Windows welcome screen are split vertically into two identical screens until the desktop is displayed).

The main problem with DisplayPort is that while gaming in 4K works flawlessly, if a game is too demanding for 4K and you want to play at 1080p or 1440p, you simply can't: those resolutions aren't supported. There are a bunch of resolutions available below 4K, but most appear in strange ratios and are seemingly unusable. This could be a problem with Nvidia's drivers, the panel itself or DisplayPort; however, after searching for solutions, I simply couldn't find one. There are scaling options in the Nvidia control panel to stretch the display manually to fit the screen, but these were disabled when using DisplayPort.

4-full.jpg

It became apparent that the newly introduced HDMI 2.0 standard would be the best solution, and that waiting for updates from Microsoft (in the form of Windows 10 or other software updates) should hopefully combat the scaling problems.

Games

Putting the teething issues aside, I loaded up Crysis 3 to put my GTX 780Ti through its paces. Setting the resolution to 4K (3840 x 2160) and all other options except Anti-Aliasing to “Very High”, I was greeted with the most breathtaking gaming visuals I have ever seen. Textures were crisp to the point of being photo-realistic and the game just looked amazing. At 25 Frames Per Second. Hmmm.

It quickly became obvious that one of the fastest single-chip graphics cards in the world was simply not powerful enough to run games at this standard of visual fidelity, as it has to push four times as many pixels as at 1080p.

I did find, however, that anti-aliasing was not particularly necessary at this resolution due to the sheer density of the pixels, which was a bonus.

Running other (less graphically demanding) games was a joy. Diablo 3 and Borderlands 2 looked amazing at 4K (easily running 60 frames per second), and I found no issues with the HUD on screen, as it seemed to scale well with the resolution, keeping its original size relative to the game. Far Cry 3 and The Witcher 2 also looked outstanding, but averaged 30-40 FPS each.

I found that the increase in resolution also increased the video RAM usage of the graphics card, and the 3GB on board the 780Ti was quickly filled up with Far Cry 3's textures. It seems that future cards may have to ship with more VRAM in order to cope with graphically intensive games running at 4K and avoid stuttering whilst textures are loaded in and out of the GPU's memory.
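As a back-of-envelope illustration of why 4K is so much heavier on memory, even the framebuffer alone quadruples. The buffer count and bytes-per-pixel here are assumptions for the sketch; in practice textures dominate VRAM use and are ignored:

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Rough colour-buffer footprint in MB, assuming e.g. triple buffering.
    Ignores depth/stencil, anti-aliasing and, crucially, textures."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(f"1080p: {framebuffer_mb(1920, 1080):.0f} MB")
print(f"4K:    {framebuffer_mb(3840, 2160):.0f} MB")
```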

Films/Media

When watching Blu-ray content at 1080p, the screen held up very nicely. The resolution was maintained well, and looked exactly like it did on the 50" TV. Game of Thrones and Life of Pi were still very crisp and I didn't notice any loss of quality from the screen up-scaling 1080p content.

Unfortunately, there really are no 4K formats apart from games to enjoy on this screen at the moment. YouTube's 4K streams are woefully compressed to the point that they seem only slightly better than 1080p, and Netflix will only stream 4K to devices such as consoles and specific built-in TV apps (4K streaming for PCs is on its way according to Netflix support, but if you're looking forward to watching House of Cards or Breaking Bad in 4K on your PC, you'll have to wait for now).

There is also no industry standard agreed-upon format for 4K media yet, so streaming and playing directly from hard drives is the only option at the moment.

I have scratched my ultra-high-resolution video itch by watching downloaded clips of nature films with 'epic' music dubbed over the top, although each one-minute clip is about 400MB in size. One downloadable 4K film is over 360GB!
If you want to watch anything in uncompressed Ultra-HD, unfortunately for now you'll have to settle for watching a one-minute video of a bee buzzing around a flower, set to 'Sail' by AWOLNATION.
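Those file sizes translate into hefty bitrates; a quick calculation (the two-hour running time for the 360GB film is my assumption, as the source doesn't say):

```python
def avg_mbps(size_mb, seconds):
    """Average bitrate in megabits per second for a clip of the given size."""
    return size_mb * 8 / seconds

print(f"One-minute 400MB clip: {avg_mbps(400, 60):.0f} Mbit/s")
print(f"360GB film over two hours: {avg_mbps(360 * 1024, 2 * 3600):.0f} Mbit/s")
```

Either way, the numbers dwarf what typical home broadband of the time could stream.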

Viewing Distance

When using the display I sit around 7 feet from it on the couch. At this distance I found that switching between 1080p and 4K made a very noticeable difference in clarity, but I needed to get closer (about 3-4 feet) to really see how different it was. At this distance, in 1080p I could easily make out individual pixels and the image had a very 'granular' appearance to it, with a noticeable screen-door effect. In 4K, however, I didn't notice any pixelation whatsoever. The games looked just as clear from 3 feet away as they did from 8 feet away. In fact, it was difficult to make out individual pixels even with my face almost pressed against the screen. Because of this, I have found that in order to truly experience the clarity of 4K resolution, you'll need to sit close to the monitor — in fact, too close for comfort in a lounge setup.

Further Usage

After using the display for a few weeks I have finally settled on using a resolution of 2560 x 1440 for most games over HDMI 1.4. With this resolution I can enjoy 60FPS over HDMI and it adds a nice amount of fidelity to games over 1080p, without impacting too much on performance using the single 780Ti. This is until I invest in a GTX 980 (or 2) with HDMI 2.0 capabilities.

One final note is that with the new consoles out, developers seem to be upping the VRAM usage of PC ports, with recent games such as Bethesda's "The Evil Within" and WB's "Shadow of Mordor" recommending 6GB for ultra textures, even at 1080p. Switching to 4K will inevitably demand even more VRAM, so it may be worth waiting for cards with 6-8GB of VRAM before 4K becomes a truly viable gaming resolution.


D-Link DNS-320L

Have you ever had a situation where you need to share a few large files between a PC and a Mac across the home network? If the answer to this question is "No" then count yourself lucky! It is truly one of the most headache-inducing tasks I have tried to do recently.

If the answer is "Yes" then I think you'll sympathise with me. What sounds like the most simple of networking tasks actually turns out to involve hours of Google searches and shaking one's head in frustration.

The situation was simple. I had a large number of files in Windows 8.1 that I wanted to transfer to a MacBook Pro. These totalled about 40GB, so I didn't want to go through the hassle of putting them on an external hard drive and then transferring them from the drive to the Mac, as it would have taken far too long with only a USB 2.0 hard drive to hand. Instead, I thought I would simply share the files on the PC with the Homegroup and pick them up from the Mac over the network. It turned out to be a lot more hassle than that, however. There are a number of guides on Google covering this process, but unfortunately none of them worked for me, due to the Mac requiring non-existent passwords for the PC and not picking up the Homegroup properly.

This process made me think that it would be great if I could have access to any of the files I wanted to share, from any device, even if my PC is turned off.

The solution to this?

Enter: The D-Link DNS-320L 2-Bay Cloud Network Storage Enclosure!

This neat little black box now sits on top of my router and contains all the files I want to share with devices I specify via a simple web browser or as an assigned drive on any laptop or PC connected to the home router.


Main unit.jpg

Using two 1TB Seagate Barracuda drives, the setup wizard allows you to configure and format the drives in the following ways:

- Standard (two separate volumes on the NAS);
- JBOD (combining the drives for maximum available space);
- RAID 0 (combining the drives for maximum available performance); or
- RAID 1 (mirroring the drives so that if one fails, the other will still contain all the data). This option, however, only allows for a maximum of 1TB of space.

I opted for the RAID 1 configuration for 1TB of space. This is because the data I wanted to store on the device is important for me to keep. If it weren't as valuable, a better option would be to configure the drives in RAID 0, in order to increase the read/write times and gain more storage space.
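The capacity trade-offs between those modes can be summed up in a few lines (a sketch of the maths only, not anything from D-Link's software):

```python
def usable_tb(drives_tb, mode):
    """Usable capacity for a two-bay NAS, given per-drive sizes in TB."""
    if mode in ("standard", "jbod", "raid0"):
        return sum(drives_tb)  # all space is usable (RAID 0 stripes it)
    if mode == "raid1":
        return min(drives_tb)  # mirrored: limited by the smaller drive
    raise ValueError(f"unknown mode: {mode}")

for mode in ("jbod", "raid0", "raid1"):
    print(mode, usable_tb([1, 1], mode), "TB")
```

With two 1TB drives, RAID 1 halves the usable space in exchange for surviving a single drive failure.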

IMAG0084.jpg
2 x 1TB Drives Ready for NAS

IMAG0085.jpg
Drives in-situ

The setup program formats the drives as appropriate (this process took about five minutes in total), and the device then appears as a network drive in the Explorer window on a PC. On a MacBook, the D-Link device appeared as a shared device in Finder, which could be accessed using the username and password set up during installation. Accessing the management window of the NAS is done by simply entering the IP address of the unit in a browser window. Easy peasy!

Local web interface.jpg
NAS Web Interface

When transferring files from the PC to the NAS over my home network, the speed averaged around 35MB/s; streaming music, large audio files and 1080p video directly from the unit over home WiFi was flawless, reading files at around 43MB/s.
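At that write speed, the 40GB batch of files from the earlier PC-to-Mac saga works out at around 20 minutes of copying:

```python
def transfer_minutes(size_gb, speed_mb_s):
    """Minutes to move size_gb gigabytes at speed_mb_s megabytes per second."""
    return size_gb * 1024 / speed_mb_s / 60

print(f"{transfer_minutes(40, 35):.1f} minutes")
```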

The D-Link DNS-320L also offers the option to sign up to www.mydlink.com in order to access files on the NAS from anywhere in the world. After a simple registration process, the web interface becomes available; it is very easy to navigate and to download and upload files to the device. There is also a very useful free "MyDlink Access-NAS" app for Android or iPhone, allowing backups of material from a phone or access to files stored on the NAS. You can also stream music from the device straight to the 'phone with the app's built-in media player (depending on the speed of the data connection), and even set the app to automatically save photos taken with your 'phone's camera straight to the NAS. Handy!

Phone app.jpg
Browsing Using the Phone

NAS online.jpg
MyDLink Interface

Downloading from the web interface was very quick - around 6MB/s at home and around 1.5MB/s on 4G using an Android phone (this being around the maximum speed I usually get for downloading anything from the Internet).

Along with great media storage and playback facilities, this home network storage solution offers a number of practical applications for backing up whole drives (for example, setting a Windows backup to run every week straight onto the NAS for peace of mind). I now have a 'fresh skeleton build' of Windows backed up, containing programs I frequently use such as Chrome, VLC player, Steam and Origin to name but a few, so whenever I want a fresh installation of Windows I can simply restore this build from the NAS rather than spend time re-downloading all the necessary programs. This can all be managed through the unit's "ShareCenter" web interface, which is very user-friendly and easy to navigate, and also includes options to control home security cameras linked to the device if you so desire.

For the price of this little unit (around £40 at the time of review), and given the ease of use and the peace of mind it offers regarding backups and storage of files, it is a cost-effective, user-friendly and convenient solution to file sharing. The D-Link DNS-320L is a fantastic piece of kit that I would wholeheartedly recommend as a basic home network storage solution.
