Category Archives: Technology
A few days ago I had cause to register a new Internet domain name for a project I’m working on.
On this occasion it was a “.net” domain.
Unfortunately, for some reason I wasn’t offered the “Identity protection” option at the time of purchase. I’ve not previously had many issues from going without it, so I didn’t think much of it.
This time, however, proved to be different.
The WHOIS records for the domain concerned contained my full name, address, and telephone number.
Within hours of registration I was starting to receive spam e-mails like this one:
Anyone with a degree of knowledge will know that search engine registration is free, and the implied threat that your website will not be indexed by search engines unless you purchase their service is misleading at best, and outright deception at worst.
What was slightly more sinister this time was that so far this morning I have received no fewer than five unsolicited phone calls from foreign-sounding ladies and gentlemen, and in one case a silent call directly to my mobile. The calls I actually answered all wanted to talk to me about website development or search engine registration.
This takes publicly available personal contact details and exploits them in an entirely new way. They MUST get enough takers to make this worth doing, but it alarms me to think that people would fall for it.
I’ve subsequently enabled ID protection on the domains I can, and have requests outstanding to complete on others.
I am using a fairly advanced spam filter on e-mails, which is “learning” about these new unsolicited e-mails every time they come in, and I use Truecaller on my mobile, and Sky Talk Shield on the home phone number.
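The “learning” these filters do is usually some flavour of Bayesian word scoring. As a toy sketch only (this is emphatically not the filter I actually use), the idea is to count how often each word appears in known spam versus known good mail, then score new messages against those counts:

```python
from collections import Counter

def train(counts: Counter, message: str) -> None:
    """Add a message's words to a per-class word-frequency table."""
    counts.update(message.lower().split())

def spam_score(spam: Counter, ham: Counter, message: str) -> float:
    """Crude per-word spamminess, averaged over the message's known words.
    Real filters combine probabilities properly and smooth unseen words;
    here an unseen-words-only message just scores a neutral 0.5."""
    scores = []
    for word in message.lower().split():
        s, h = spam[word], ham[word]
        if s + h:
            scores.append(s / (s + h))
    return sum(scores) / len(scores) if scores else 0.5

# Train on one example of each class (illustrative text only).
spam, ham = Counter(), Counter()
train(spam, "register your website with search engines today")
train(ham, "meeting notes for the project tomorrow")
```

Each new unsolicited e-mail that gets marked as junk goes into the spam table, which is why the filter keeps getting better at spotting the next wave.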
I wonder how many more will bug me?
I got a reasonable score for my day-to-day password! 🙂
Having decided to have a play with Cisco’s VIRL solution, I’d intended to set it up on an old laptop that I had laying around, but given the requirement for a minimum of 4 CPU cores, VT-x support (for hardware-assisted virtualisation) and no fewer than FIVE network interfaces, I decided that I’d be better off using VMware Workstation instead. VIRL doesn’t run under VirtualBox, so VMware it is. One licence purchase later, and I’m merrily installing it on my Core i7 3820 machine, with plenty of RAM and disk space, under 64-bit Windows 8.1. No problem! Or so I thought!
I vaguely remember having issues with VirtualBox hosting 64-bit guest OSes before, but since my need at the time wasn’t too specific, I didn’t spend the time trying to resolve it. I was surprised to find, though, that once installed, VMware Workstation 11 was nagging me that a 64-bit guest OS was not supported on my machine. I’d checked the prerequisites quite carefully.
Of course I Googled, quite extensively, and didn’t find much that described my exact issue. And yes, I checked in my BIOS that the hardware virtualisation settings were enabled (they were!); I was definitely running a 64-bit OS, so why would it not work? My CPU, an i7 3820, definitely supported the virtualisation extensions, but the Intel Processor Identification Utility stubbornly disagreed! My rig is a 2-3 year old custom build from Mesh Computers, based on an MSI X79A-GD45 main board. There should have been no issue with virtualisation, and there was certainly no issue with running 64-bit Windows on it!
After much research, I stumbled on an article from VMware dated December 2008 suggesting that VT-x is often unavailable to normal software if “trusted execution” is enabled. That sent me off into the ClickBIOS once more, as I recalled seeing a setting for an “Execute Disabled Bit” which was ON, so I decided to try turning it off and see what happened. I’ve attached the in-Windows screenshot of the BIOS simply because it’s easiest to capture, but the setting is found in the same place at boot time.
It was tucked away under Overclocking Settings, then CPU Features, just before the Intel Virtualisation Tech and VT-d Tech options. Sure enough, disabling this made VMware quite happy with a 64-bit guest OS, and the Intel CPUID tool also now claimed that I was capable of virtualisation.
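For anyone hitting the same wall on a Linux box, you can check whether the OS can actually see the virtualisation extensions by looking for the relevant CPU flags in /proc/cpuinfo. A minimal sketch of that check (bearing in mind, as I found out the hard way, that the silicon can support VT-x yet a BIOS setting can still mask it from software):

```python
def has_virt_extensions(cpuinfo_text: str) -> bool:
    """True if any cpuinfo "flags" line advertises Intel VT-x ("vmx")
    or AMD-V ("svm"). A False result on a capable CPU usually means a
    BIOS/firmware setting is hiding the feature from the OS."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            _, _, flags = line.partition(":")
            if {"vmx", "svm"} & set(flags.split()):
                return True
    return False
```

Typical usage would be `has_virt_extensions(open("/proc/cpuinfo").read())`; on Windows, Intel’s Processor Identification Utility does the equivalent job.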
PROBLEM #1 Solved!
Next step will be downloading and installing VIRL itself within a Virtual Machine. That’s a story for another day!
Okay, so I’m not very good about keeping my promises to post on here more often!! Hopefully that’s about to change, and more about that in a few minutes.
Quite a bit has happened since I last posted. Most notable would be the delivery of my nice shiny new car on 20th May, closely followed by it and me being #3 in a four-car sandwich a mere 20 days and 1,046 miles later. As I write this post, the car is still with the repairers, and I’ve now been driving the courtesy car for as many days as I drove the Mercedes! I haven’t got around to taking some nice pictures of the new car yet either.
My Plex library is now up to over 480 films, and more than 2,000 episodes of 110 different TV series. I’m working my way through my DVDs slowly, but I’ve not found a way to rip them that’s totally to my liking as yet; my current process is fairly slow, and results in files that are rather larger than I would like!
Anyway, I’ve decided to try and reboot my technical interest once again. I made the decision some years ago to allow myself to be drawn into the world of management rather than remaining a techie, and while I don’t necessarily regret that decision, I have found of late that I’m losing some of the technical skills which allow me to add my particular value in the role I currently hold. So I’ve signed up for a year’s licence for Cisco’s VIRL programme, so that I can play with, and brush up on, some of the newer solutions that I’ve missed out on in recent years, including NX-OS and IOS-XE, as well as F5’s and Palo Alto firewalls in their virtual flavours. I’ll endeavour to report back here on my findings, and any useful technical blurb that I find, in the hope that at some point someone might find it useful.
So, following hot on the heels of this post will be my first technical issue, identified while preparing to set up VIRL under VMWare Workstation for the first time.
I’ve been doing more musing than usual recently on where technology evolution in the network arena is heading over the next few years, and the concept of a virtualised CE router keeps popping into my head. This entire post is a bit of blue-sky thinking, but it’s not that far away from where we are today.
I think of the idea as a logical next step in hybridising Virtualisation and Network Function Virtualisation with Software Defined Networking.
Virtualisation has already taken over the data centre, with VMware and others able to provide logically discrete virtual switching, routing, and firewall instances within the cloud infrastructure, so why not take the next step and consider virtualisation for some of the additional services we might want to use? Indeed, the IETF has a draft considering exactly this for MPLS VPNs.
Current WAN networks follow a fairly traditional delivery model in that the edge of the carrier network is terminated onto a local piece of Customer Premises Equipment (CPE), which in turn is connected to a “Customer Edge” (CE) device usually provided by the network operator. Domestic DSL services follow a similar model.
My vision of a Virtual CE device fits both the conventional WAN solution, and in particular MPLS type deliveries, and a consumer grade DSL service.
Ethernet is increasingly becoming the bearer of choice for MPLS and enterprise WAN services, delivered over copper or fibre and terminating on an RJ-45 Ethernet port on the CPE. Since this is literally an Ethernet service delivery, why not shift the “intelligence” back to the other end of the circuit, enabling the service provider to virtualise the physical device and deliver a logical instance from a shared hardware platform? This reduces the equipment that could “go wrong” on a customer site, reducing (but not totally eliminating) the need for engineer visits and break/fix maintenance, and ultimately saving costs. The carrier can also standardise the services that the customer takes, and capitalise on investment in centralised CE equipment. It would still be possible to use tagged Ethernet to deliver traffic to different networks/VLANs for the more sophisticated requirements, and this doesn’t really change the scope for screwups which could cause traffic to be delivered into the wrong logical networks due to mis-patching (although I do know of a solution that might help there too! 🙂 )
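Tagged Ethernet here means standard 802.1Q: the switch or CE stamps each frame with a 12-bit VLAN ID that tells the far end which logical network the traffic belongs to. As a small illustrative sketch of what that tag actually looks like on the wire, here is a parser for the VLAN ID of a raw Ethernet frame:

```python
import struct

def vlan_id(frame: bytes):
    """Return the 802.1Q VLAN ID carried by an Ethernet frame, or None
    if the frame is untagged. After the two 6-byte MAC addresses, a
    tagged frame carries the TPID 0x8100 followed by a 16-bit TCI
    field whose low 12 bits are the VLAN ID."""
    if len(frame) >= 16 and frame[12:14] == b"\x81\x00":
        (tci,) = struct.unpack("!H", frame[14:16])
        return tci & 0x0FFF
    return None
```

Mis-patching a tagged trunk simply delivers a valid-looking frame into the wrong logical network, which is exactly why the tag alone is no protection against the screwups mentioned above.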
Extending this line of thought into the consumer market, I think it has massive potential there too. It may still be necessary to have an intelligent black box of a sort as CPE to provide a Layer 2 connection back to the intelligence in the virtualised CE environment (using something like L2TP over DSL to the virtual CE router?). Of course, local WiFi breakout services will still be required (Cisco already have the Meraki cloud-managed access point range), but nevertheless similar benefits around centralisation, management, and economy of scale could apply. Consumers could still manage their own CE device via a browser, but the carrier could have a far greater degree of influence over the make/model of CE device the customer uses, enabling standardisation as well as opening the door to many more value-added services. Some possibilities include:
- Central, Redundant, Backed up Network Attached Storage
- Media Centre/TV and related services (XBMC/Netflix/Plex/Sky Plus/Virgin TIVO etc)
- Remote Access/VPN
- Firewall & Security
- Shared Access (Data Sharing, Gaming, etc)
- Content Filtering
Taking those points in order:
Network Attached Storage: How many high-tech families (read: geeks) have sophisticated home networks with Network Attached Storage capabilities, used to backup Photos/Music/Documents, or other locally stored Data? This type of virtualisation could allow the carrier to provide (sell!) Exchange or Data-centre based NAS/SAN capacity.
Media Centre: What about those people using media servers running on a NAS or dedicated server hardware? iTunes or AirPlay servers to stream music to a Sonos or similar? Centralised access to subscription-based TV services such as Netflix or Amazon Prime Video, or even inbound access to your Sky Plus or Virgin TiVo? Local storage (maybe on NAS?) of your own movies using Plex or XBMC?
Remote Access/VPN: I can only predict this area will grow and grow. I currently have the capability to establish a private VPN connection to my home network in order to access data stored on my NAS. As the “Internet of Things” accelerates, we will increasingly access additional home-based solutions remotely, including lighting, home security, central heating, electricity/gas meters, even cookers and freezers.
Firewall & Security: We all hear about the latest and greatest zero-day exploits. Wouldn’t it be great if we could sit back secure in the knowledge that our service provider was protecting us against these threats centrally, with that measure of control sitting behind an easy-to-use UI?
Shared Access: Already we find the younger generations gaming together within the same house on their respective games consoles with LAN-enabled gaming, and of course MMORPGs are extremely popular too! Why not have the neighbourhood kids playing Minecraft together on a private server that only they can get to? This is about the ability to selectively extend parts of the network between entities (on a controlled basis, of course). Want to access that particular music track from home while you’re visiting a friend? No problem!
Content Filtering: How about being able to deliver different levels of filtering, maybe to different WiFi SSIDs or LAN ports on the local black box? How about separate SSIDs for “Adults”, “Teenagers”, and “Children”, each with differing levels of content filtering, and maybe even logging, applied?
And of course that’s before we start entertaining the ideas of Desktop-as-a-service, or the shift of compute workloads to the cloud. I’m pretty sure it’s only a matter of time before we shift the work behind our games consoles away from black boxes in the home, and just use a virtual-screen display type solution for it all! (nVidia SHIELD?)
I know that much of this can be done today, but it requires a particularly persistent technical person to make it all work, and even then it’s not yet as seamless as we’d all like! I think that the idea of virtualising the CE takes us a step towards my vision, and is a potentially lucrative area for the carriers to investigate.
What do you think?
Late last year I purchased an Amazon Fire TV.
I wanted it so I could watch Netflix and Amazon Prime Instant Video without needing to turn on my Xbox, which was rather noisy, clunky, and convoluted to navigate to the right applications.
The Fire TV box looked to be ideal, and with the changes Amazon made to Prime in the latter half of the year, it became an easy decision.
What I didn’t count upon was discovering Plex and the Plex Media Server.
There is a Plex app which can be installed on the AFTV, which connects over the network to a Plex Media Server running on a PC (Windows, Linux or FreeBSD) or Mac, or if you’re fairly lucky, a NAS. My 3-year-old QNAP NAS allows me to install it, but just doesn’t have the CPU or memory to successfully run the Media Server, let alone handle the transcoding workloads it generates, so I might just have to consider a NAS upgrade in the future. In the meantime, running the PMS on a Windows desktop PC and creating links to mapped drives containing the content that sits on the NAS seems to work well enough!
So then I started to dabble a bit in creating the content to feed it. Anyone that’s been to my place will know I’m something of a quality fiend, enjoying high-definition picture quality from Blu-ray, and 5.1 or higher surround sound, so I didn’t really want to lose any of that when streaming. But if I could achieve an acceptable balance of quality against size, it would mean that I didn’t have to keep reaching for physical media when I wanted to watch something, which would be fantastic!
After a fair bit of trial and error, and extensively searching t’interweb, I’ve settled on a two-stage process. Firstly, I use MakeMKV to “rip” a Blu-ray disc. This generates a lossless source copy of between 20 and 25GB (depending of course on the source). This has been legal in the UK since 1st June, assuming of course that you legitimately own the source media and aren’t ripping borrowed or loaned media. This follows the Hargreaves Review, see here and here for details. It takes me between 20 and 30 minutes to generate this source file on a fairly well-specified (but two-year-old) Core i7 3820. There is a trick to using MakeMKV, and that’s to make sure you choose the correct soundtrack that you want to keep (e.g. English/DTS), and if you’re like me you’ll only want to rip the movie itself, and not bother with the other extras on the Blu-ray disc.
I then take this rather large .mkv file and run it through Handbrake, with the video set to an RF value of about 20 for Blu-ray source media. In another 25-30 minutes it will re-encode the .mkv into an H.264 .mp4 file, usually between 6 and 10GB in size depending on the length and quality of the source, which would therefore allow me to store between 200 and 300 films on a 2TB HDD.
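I drive this second stage through the Handbrake GUI, but the same encode can be scripted with HandBrakeCLI. As an illustrative sketch (the filenames are made up, and the flags are the documented constant-quality H.264 options rather than my exact preset), a helper that builds the command line:

```python
import subprocess

def handbrake_cmd(src: str, dst: str, rf: int = 20) -> list:
    """Assemble a HandBrakeCLI invocation for a constant-quality H.264
    encode. RF ~20 is the sweet spot I settled on for Blu-ray sources;
    lower RF means higher quality and bigger files."""
    return [
        "HandBrakeCLI",
        "-i", src,              # input: the lossless MakeMKV rip
        "-o", dst,              # output: the much smaller .mp4
        "-e", "x264",           # H.264 via the x264 encoder
        "-q", str(rf),          # constant-quality RF value
        "--aencoder", "copy",   # pass the chosen soundtrack through untouched
    ]

# To actually run an encode you would hand the list to subprocess, e.g.:
#     subprocess.run(handbrake_cmd("film.mkv", "film.mp4"), check=True)
```

Batch-processing a folder of rips overnight then becomes a simple loop over the .mkv files.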
Then it’s just a question of setting up libraries on the Plex Media Server for each type of content. This isn’t difficult, but requires a little thought and planning. I created separate libraries for Films and TV series, because that’s how Plex separates them. This makes sense given the media enrichment capabilities Plex has: it will try to identify the media from the filename, and will then download extra information from IMDB and/or elsewhere about the cast/crew, posters and/or thumbnails, and generally make it look very sexy on the screen. For TV programmes, it allows drill-down by series too, so you can choose Series -> Season -> Episode.
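The filename matching works best if you follow Plex’s documented naming scheme: movies as “Title (Year)” and episodes as “Show/Season NN/Show - sNNeNN”. A small sketch of helpers that generate those paths (the “Movies” and “TV” top-level folders are just my assumed layout):

```python
from pathlib import PurePosixPath

def movie_path(title: str, year: int, ext: str = "mp4") -> str:
    """Plex-friendly movie path: Movies/Title (Year)/Title (Year).ext"""
    name = f"{title} ({year})"
    return str(PurePosixPath("Movies") / name / f"{name}.{ext}")

def episode_path(show: str, season: int, episode: int, ext: str = "mp4") -> str:
    """Plex-friendly episode path: TV/Show/Season NN/Show - sNNeNN.ext"""
    return str(
        PurePosixPath("TV") / show / f"Season {season:02d}"
        / f"{show} - s{season:02d}e{episode:02d}.{ext}"
    )
```

Get the names right up front and the enrichment (posters, cast, theme music) almost always lands on the correct title first time.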
Above is the view you get from a PC or Mac running Google Chrome and browsing to the Plex Media Server. It shows all of the movies I’ve got stored in my library.
If I click on any of these, I’ll get a further display with details of the cast, crew, and a synopsis of the plot, all enriched with background images taken from the movie, and where Plex can find it, the theme music too.
The same process happens for TV programmes, with the added step of choosing a season where applicable.
The User Interface from the AFTV itself is remarkably similar, with the same enrichment capabilities, and the whole process just works so very well.
In fact, the AFTV will also tell me the quality of the video and sound! If you look in the bottom left of the photo below, you’ll see an example showing 1080p, H.264, and DTS 5.1 sound.
So, my project has become to further expand my library, and transfer as much of my Blu-ray collection onto Plex as I reasonably can. This will doubtless take me years given the rate at which I accumulate discs, but I’ve already started doing it with new media as it lands, so I can watch it at my leisure without reaching for the disc.
Any Networking/IT types out there that happen to come across this website might also want to take a look at http://www.sergeantclip.com
Please help me spread the word, I truly think this product is brilliant – it’s so simple and effective at what it does, and it’s amazing that it’s not been thought of before!
Being myself a product of the British educational system, and having followed a fairly conventional path into my current career, I have long felt substantial frustration with the direction of our current higher educational system.
I am of course talking primarily about the prevalence of “Hairdressing”, “Media Studies”, “Photography”, and “Tying your Shoelaces” courses which seem to be so popular with the modern youth. Coupled with this has been a somewhat backwards attitude towards Computing and IT qualifications, or “ICT”. Thinking back to when I graduated, some 15-plus years ago now, some of the available courses were somewhat out of touch with the marketplace, teaching obsolete programming languages (COBOL) and disciplines such as JSP which the real world had left behind. I spent the first two years after graduation unlearning, or more specifically relearning, what I actually needed for a real career in Computing and IT. To be fair, despite teaching obsolete topics, the courses had equipped me with the basic approach and knowledge I needed to apply to any language or discipline that I required. Indeed, it would have been foolish to assume that I would be programming in COBOL or Pascal for the remainder of my programming days, knowing that the market would evolve in unknown directions.
The technology sector is the industry in which I work, so I have a huge vested interest in the direction it takes. For me it is the future: it is replacing many of the white-collar types of work our parents could or would have done in their lifetimes, and it has the potential to become the new wealth-creating sector of the economy, since it’s evident we just cannot compete in manufacturing, and have exhausted most of our raw materials already. In my estimation this makes it vitally important for our economy, even for those that have no interest in working within the sector.
For me then, the educational system has let the technology sector down badly for the last ten years or more. Many people don’t seem to have much of a clue about the difference between Computing and IT, or “Information Technology” to give it its full title. For me, IT is about using the modern tools available to do a job. It’s not specifically about that £500 PC or £1000 laptop sitting in front of you; it’s about being able to use that piece of equipment to help you complete a task of some sort, in the same way screwdrivers, paintbrushes, spanners, and shorthand might support other professions.
If we are honest, in recent years this has translated into being able to use a mouse and keyboard, to navigate Windows, and yes, even to know how to use popular office suites such as MS Office, or perhaps a tablet or smartphone. It is absolutely vital that this particular facet remains current and aligned to market trends. In the last five years or so this has meant knowing the Microsoft product suite: Excel, PowerPoint, Access, and Word specifically. Of course, to do all of that you don’t need to understand much about how a computer works, or why it does what it does; no knowledge is needed of binary or electronics, although a high-level knowledge of computer components would probably be useful. All of these skills will be handy for just about any student of any discipline these days. In much the same way as English and Maths are part of a standard curriculum, so I believe Information Technology should also be included as standard, probably in the first 2-3 years of secondary education (I believe they call it Years 7-9 these days!). I know “graduates” of a higher-level computer programming course who think programming is using Visual Basic for Applications within Access or Excel, when in my estimation this is “just” a tool (admittedly a fairly sophisticated skill to possess) to do a job, and properly a product of the Information Technology arena.
Contrast this with a Computing course, or more correctly a course in Computer Architecture or Computer Programming, which clearly requires a much deeper, lower-level understanding of how computers work and why they do what they do; perhaps the missing element of Networking could be folded into the Architecture section? In any case, out of this fundamental knowledge of computer architecture grows a necessary understanding of operating system concepts, which in turn grows into APIs and programming languages that may or may not be platform specific.
I have recruited and interviewed staff several times during my career to date, and each time I’ve found it necessary to explore just what the Computing courses on a candidate’s CV actually contained, and from that establish whether the basic level of required knowledge is there. In the networks space, it’s quite astounding how few people in IT actually grasp how the Internet works, and how applications can be exploited over the network if not written properly.
So, enough moaning, Why are we on the brink of something wonderful?
Two reasons…. Firstly a Slice of Pi.
The Raspberry Pi is a fantastic concept which has been long missing from the marketplace. It is a simple and cheap (not just cost-effective) way of providing computing power and technology. It literally costs about $30 to produce, and since the Raspberry Pi Foundation does not aim to turn a profit, this is loosely the price to you as an end user. In exchange you don’t get a finished product nicely cased with lots of glossy instruction manuals. Instead you get a completed PCB slightly larger than a credit card, but probably smaller than the smartphone in your pocket, with a media card slot, a couple of USB ports, a network interface, and an HDMI output so you can plug it into a modern TV, and not much else! No PSU, no keyboard/mouse, and no operating system is provided; instead you have to use another PC to write the open-source OS onto a media card, then plug in a USB keyboard and mouse, and away you go!
Currently most of the available OSes are Linux based, but I expect this will change over time. The Pi uses an ARM processor on a Broadcom system-on-a-chip, and has a limiting 256MB of RAM, but again, this costs $30 (or roughly £30 for us Brits).
What an amazing catalyst for learning the Pi could be, providing cheap, accessible, programmable computers everywhere! It’s aimed at the educational sector, and it fits in so well with my vision of a Computing course it’s astounding. It could be used to learn web programming: several of the currently available OS distros can support a LAMP (Linux, Apache, MySQL, & PHP) stack, and it facilitates a clear understanding of the system architecture. Indeed, its limitations play to this as well: you have to write code that performs well within the capabilities of the SD-card-based storage and 256MB of RAM.
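That 256MB ceiling teaches good habits early. A simple example of the kind of discipline it forces: process a big file as a stream, one line at a time, rather than slurping the whole thing into memory (the filename below is purely illustrative):

```python
def count_matches(lines, needle: str) -> int:
    """Stream through an iterable of lines, counting those containing
    `needle`, without ever holding the whole file in memory - exactly
    the habit a 256MB machine teaches quickly."""
    return sum(1 for line in lines if needle in line)

# With a real file you would pass the open file object itself, which
# yields one line at a time rather than loading everything:
#     with open("access.log") as f:
#         hits = count_matches(f, "GET /index.php")
```

The same function works unchanged on a gigabyte log file or a three-line test list, which is the whole point of writing it as a stream.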
Other options are there too, with C and Pascal compilers, BASIC compilers/interpreters, and so many possibilities it’s incredible. The Pi doesn’t lack for processing power either: it’s capable of decoding and outputting HD video at 720p to your HDTV, or even playing Quake.
So, if you are a parent with a curious youngster, I strongly suggest you think about investing in one! It’s not going to replace your family PC, at least not yet, but what better way for both you and your children to learn together about computers? And it doesn’t stop there either; schools adopting the Pi into the curriculum could develop Craft, Design and Technology (or whatever they call it these days!) sessions around making cases for the Pi, and so much more!
The second reason dovetails neatly into the slice of Pi. If the Pi helps shape the Computing aspect of technology educational needs, then the Government’s brave decision in January this year to effectively scrap the current ICT curriculum, and to invite dialogue with the technology industry on developing a replacement, has kindly facilitated the other missing part of my vision.
The so-called “Microsoft GCSE” has the potential to deliver students with real skills in modern technology solutions to the workforce in a few years’ time. If at 16 or 18 a student has the ability to write applications for a modern Windows PC using Visual Studio, to interface with and control a smartphone, or even to produce apps for that smartphone, then the system has done its job, and will be producing worthy candidates once more.
I know that other vendors including Cisco have been engaged in discussion over the future curriculum too (see here). Perversely, the Government has even been criticised for perhaps listening too closely to the vendors, but at the end of the day it’s vendors like Cisco, Microsoft, and the Raspberry Pi Foundation that are going to be leading the market in all sorts of directions over the next 5-10 years, so why should we not pay close heed to their needs?
If all of this happens as I sincerely hope it will, UK plc has the potential in years to come to retake a worldwide lead in the technology market, and to turn out some of the most supremely, and more importantly usefully, qualified students, who will go on to lead our economy who knows where! This is something we’ve been fighting to do for years now in the face of strong competition from Asia and the Pacific.
This is all my own opinion and perspective, and of course it’s possible I’ve got it all wrong, and my Vision is naught but ideas in my head. I sincerely hope not, but all the same it makes me feel like we’re on the brink of a whole new journey now, so let’s make the most of it together, and see where it takes us!
(I am currently on a waiting list to get my own Raspberry Pi.)
This evening, while faffing about with websites and things, I realised that it’s April already, and that I’ve not updated this site with any new content since January!
I really must make more of a conscious effort to update my witterings here, even if they are seldom read by anyone other than myself!
So… what’s new? Probably not much!
- I’ve driven nearly 18,000 miles in my (no longer quite so new) car in the last 9 months!
- I’m getting to know the road between Home and Chichester extremely well indeed! (for Work!)
- I’m still working too hard!
- I’m still just as much of a geek as ever!
I think this year is going to see some interesting changes for me in a number of areas. I expect that in January next year, I’ll look back on this post and think – Yes, I was right! Watch this space!
So I’ve refreshed this site with a new theme! I expect I’ll be buggering about with it for a while!
I’ve helped my cousin and her man launch a new Website for their business. Why not pay a visit to www.heathmorgan.co.uk and have a gander?
Now I really MUST make sure that I find some new content to put on this website more often!
OK, so my trusty old Acer laptop finally gave up the ghost, with some very odd distortion on the screen meaning everything was either psychedelic green or pink! It’d done me a good turn, and it surprised me when I sat down to think about how long I’d had it; the “Designed for Windows XP” sticker proudly on the front really should have given me a slap in the face. Still, having survived at least one HDD and RAM upgrade, and two OS upgrades, to Vista and then to Win7, it really didn’t owe me anything.
I thought long and hard about what to go for; Did I want a really powerful desktop replacement style laptop, with a Core i7 and all the whistles and bells, or a fast & light Ultrabook?
In the end I opted for the Asus N55S, which is a really beautiful piece of kit. I can’t seem to find one online that matches my spec exactly.
Specs are: Intel Core i5 2430M @ 2.4GHz, 6GB RAM, 6x Blu-ray reader/writer, 500GB HDD, NVIDIA GeForce GT 635M graphics with 2GB dedicated VRAM, USB 3.0 ports, and a built-in Bang & Olufsen ICEpower sound system (with separate subwoofer).
Overall this is one very sweet laptop. It’s powerful enough to act as a desktop replacement for when I’m out and about. It’s not an ultrabook by any means, being fairly large and weighty, but that doesn’t bother me too much.
Here’s the intro Video so you can see for yourself!
I can thoroughly recommend it!