Monthly Archives: January 2013

Art’s Predictions as Cyber Security is Re-thought

Art Coviello, CEO of EMC's RSA security division, has made his predictions for security in 2013 (Forbes here). It does read a bit like the good, the bad and the ugly… (maybe not in that order). However, his thoughts are worth considering!


If you are pressed for time, Art’s 8 key points are (the not-so-good news first):

  1. The hackers will likely get even more sophisticated.
  2. Our attack surfaces will continue to expand and any remaining semblance of a perimeter will continue to wither away.
  3. These changes will occur whether security teams are ready or not.
  4. National governments will continue to diddle or, should I say, fiddle.
  5. It is highly likely that a rogue nation state, hacktivists or even terrorists will move beyond intrusion and espionage to attempt meaningful disruption.

The good news:

  1. Responsible people in organizations from all verticals, industries and governments will move to that newer intelligence-based security model.
  2. I also predict a significant uptake in investment for cloud-oriented security services to mitigate the effects of that serious shortage in cyber security skills.
  3. Big Data analytics will be used to enable an intelligence-based security model.

 

The interesting piece for me is the evolution of the journey from perimeter defence at the edge to an intelligence-based security model. My analogy is James Bond! The old method is like the scenes, mostly in the older movies, where James had to climb some impossible mountain cliff face, breach an even more impossible wall, fight hundreds of heavily armed guards… and then save the maiden! The new approach is where James is right inside the bad guys’ domain, (sometimes involving a game but with real consequences), and the strategic manoeuvring is played out.

ONLINE Event: Anyway, something big seems to be going down this week; register here. You can catch it live at 1am or 5am… or maybe a more sensible option is the replay next week at 2pm Sydney time, on the 5th.


Are you hoping for an IT Disaster?

Datacom, a good partner of EMC, was kind enough to invite me to participate in a panel discussion at their event yesterday. The keynote speaker was Martin Grunstein, whose message is very grounding, and he delivers it by drawing on his stand-up skills: very entertaining. (I actually saw him at a Microsoft event over 10 years ago and still use his advice; how many keynotes can you remember?)

On the panel with me was the CIO of Brighton Adelaide, whose organisation is one of the companies that has recently gone live with VPLEX, so I asked him how it was going. His reply, with a little smile on his face, was, “We are kind of hoping for a disaster!” to which we both followed on with, “be careful what you wish for”. But just think about that! How many of you are so confident in your operation that you do not fear and dread a disaster… or even a disaster test?

Then, very gingerly, I asked if that was the main reason they had implemented VPLEX. (My mind re-living the sales training where someone says… “who sold you that then?!”) With a sigh of relief he said no, not just for DR! Just remember that only about 5% of downtime is caused by disasters.

Now we all know that DR is a major discussion here, but to me it’s a minor part of the benefits of VPLEX’s true active-active capability. (I tell people to think of it as one datacentre that just happens to be in two physical locations.)

One benefit that is not so obvious is the immediate hardware saving in the cost of doing DR. Let’s say you wanted to provide 50% capability in the event of a disaster. Traditionally you would purchase 1.5x the capacity you needed for production to achieve this; with VPLEX, in the ideal case, you purchase 1x and put half on each site… and you have the same level of protection!
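To make that arithmetic concrete, here is a throwaway sketch using just the numbers above (a simplified illustration of the reasoning, not a sizing guide):

```python
# Illustrative back-of-the-envelope arithmetic using the numbers above; not a sizing tool.
production_capacity = 1.0   # capacity needed to run production (normalised to 1x)
dr_capability = 0.5         # fraction of production you want to keep running after a disaster

# Traditional DR: full production at site A plus a 50% standby copy at site B.
traditional_purchase = production_capacity + dr_capability * production_capacity   # 1.5x

# Active-active (VPLEX-style): buy 1x in total and split it evenly across the two sites.
active_active_purchase = production_capacity                                       # 1.0x
left_after_site_loss = active_active_purchase / 2                                  # 0.5x survives

print(f"Traditional DR purchase: {traditional_purchase:.1f}x of production capacity")
print(f"Active-active purchase:  {active_active_purchase:.1f}x, and a site loss still "
      f"leaves {left_after_site_loss:.1f}x, i.e. the same 50% capability")
```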

Also, it takes the pain out of lifecycle management… of the entire datacentre… need to upgrade the air-conditioning? Just move the apps to, or continue to run them on, the other side! Need to refresh the storage… Tuesday at 11am sounds good, doesn’t it!!

Anyway… here’s wishing for a disaster… not!

It’s nice to be vindicated, isn’t it? (BIG data meets Security)

I hold a series of C-level roundtable discussions every quarter for a select group of EMC customers and prospects. At all of these lunches the talking starts as we sit down and no-one seems to take a breath until the venue throws us out. While discussing security last quarter, I spoke about the role of ‘Big Data’, which was met by a collective sigh around the table… and a comment about “Clive’s hobby horse!”


Last week I was vindicated by a press release titled “Security Leaders Urge Organizations to Prepare for Big Data Revolution in Information Security” (google search). I keep telling people that security is a big data problem because it fits the ‘classical’ big data classification of the three Vs, and maybe another V:

– Volume: lots of data coming from the fire hose – logs, events, etc.

– Velocity: just consider the EMC RSA product NetWitness, which captures every network packet in and out of an organisation.

– Variety: every place you take information from will provide it in a different shape, size and flavour.

Then the ‘Big Data’ techniques of matching, linking and modelling allow us to bring the data together, look for known patterns, and apply heuristics to look for suspicious behaviours… and voilà – security is a Big Data problem.
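As a toy illustration of that matching-and-heuristics idea, here is a minimal sketch; the log records, the baseline and the thresholds are all invented for the example and have nothing to do with any particular RSA product:

```python
# Minimal sketch: 'link' two invented event feeds and apply a simple heuristic -
# flag users whose repeated failed logins are followed by a success from an unusual country.
failed_logins = [("alice", "AU"), ("alice", "AU"), ("alice", "RU"), ("bob", "AU")]
successful_logins = [("alice", "RU"), ("bob", "AU")]
known_countries = {"alice": {"AU"}, "bob": {"AU"}}   # baseline behaviour per user

suspicious = []
for user, country in successful_logins:
    failures = sum(1 for u, _ in failed_logins if u == user)
    unusual_location = country not in known_countries.get(user, set())
    if failures >= 2 and unusual_location:           # heuristic for suspicious behaviour
        suspicious.append({"user": user, "country": country, "failed_attempts": failures})

print(suspicious)   # [{'user': 'alice', 'country': 'RU', 'failed_attempts': 3}]
```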

The brief is worth reading (here), as it outlines an approach to preparing to include a ‘Big Data’ approach in your ‘intelligence-driven’ security program:

  1. Set a holistic cyber-security strategy
  2. Establish a shared data architecture for security information
  3. Migrate from point products to a unified security architecture
  4. Look for open and scalable Big Data security tools
  5. Strengthen the SOC’s data science skills
  6. Leverage external threat intelligence

Big Data is changing the way we live, work and play!

The Information Factory

@chuckhollis posted a blog recently which equates IT to a modern factory (here). This is great; as some of you know, I am the analogy king and I’ve been thinking about this analogy for a while… and Chuck has beaten me to it!

Chuck talks about the functions of a modern factory and how they relate to modern IT, such as “demand forecasting, process optimization, supply chain optimization — and, yes, product quality”. It’s a great article and traces the parallels between them; however, I was thinking about a couple of issues that he only touches on.

Agility:- This is the catch-cry of the Cloud discussion; however, what do we mean by agility? Chuck talks about agility in terms of scale: get bigger or smaller very quickly! But how about agility in terms of ‘flexible manufacturing’, where a factory has tooling that allows it to be re-configured to produce different products? By using robots and CNC (Computer Numerical Control) machines, you can essentially re-program a factory to produce a different product; for example, cut the pieces of wood to create different chairs, or a table! (Virtualising the factory!)

Process Optimisation:- Building on Chuck’s solid foundation, I think we can extend the analogy here. One of the optimisation techniques in earlier factories was ‘worker activity optimisation’, where an expert would watch the workers on the line and remove redundancy, double-handling, inefficient movements, etc. The idea was that by watching what was being done, you could recognise the patterns and optimise around them.

Consider if you could watch everything that everyone you work with does each day. You would notice massive inefficiencies (in the flow of information, the way tasks are performed, etc.). A few small changes would then create massive productivity gains!

Move down a level: these people use systems to perform these various functions. These sophisticated systems provide alternative ways to produce the desired outcome. If we could watch how each person does essentially the same task, we would find that some ways are better than others… and with a bit of training everyone could adopt the best practice: more productivity.

And lastly, down another level: those systems are essentially software running code. Imagine if you could monitor the flow through the code as everyone uses it. We could understand how to make the code more efficient, we could see what never gets used, we would see the path to a bug, etc. Now think of the impact of having this meta-data about the running system… and the optimisation that could be done!
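As a hypothetical sketch of what collecting that kind of meta-data could look like, here is a simple call-count and timing decorator (the business functions are made up, and a real system would use proper profiling or tracing tools rather than this toy):

```python
import time
from collections import defaultdict

call_stats = defaultdict(lambda: {"calls": 0, "seconds": 0.0})

def monitored(func):
    """Record how often each function runs and how long it spends: meta-data about the running system."""
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            stats = call_stats[func.__name__]
            stats["calls"] += 1
            stats["seconds"] += time.perf_counter() - start
    return wrapper

@monitored
def lookup_customer(customer_id):   # hypothetical business function
    time.sleep(0.01)                # stand-in for real work

@monitored
def monthly_report():               # functions that never appear in the stats are candidates to retire
    pass

for customer_id in range(5):
    lookup_customer(customer_id)

print(dict(call_stats))             # e.g. {'lookup_customer': {'calls': 5, 'seconds': ...}}
```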

This, to me, is one of the really exciting aspects of Big Data. And if you are thinking this is very “blue-sky” thinking… well, it is not as far out as you might believe. (For example, there is talk of doing this process optimisation by monitoring the case-flow data that can be obtained from Documentum xCP and using Greenplum analytics to optimise it.)

Forget BYOD how about BYOA?

I’ve been thinking about whether BYOA (Bring Your Own Application) is a viable model for the new world.

Back before the corporate PC world was dominated by Microsoft Office, there were different options for people to do their word-processing and spreadsheet modelling. This situation had two main advantages: people would carry their application skills with them to new tasks, and it spurred competition between the application vendors (Lotus vs. Excel, and WordPerfect vs. Word).

If we take this idea into an open-data, standards-based world, it makes things interesting. For example, an application vendor becomes a master of their application logic or functionality, with no underlying ‘data-model’ to limit upgradability or tie people in! (Different to the traditional computer-scientist approach of defining the data-model and then building the logic on top.)

The user can select a powerful tool and become proficient with it. Then this skill gets applied to solve a number of different problems. For example, say I have the ultimate analytics application and want to understand EMC’s position in the market in order to develop a plan. What would be useful is to understand EMC’s results, the perception of the company, the competitive landscape, etc. Now I believe the truth is out there, as I have access to market data (future case) from the governing bodies (import data, annual reports, etc.), market research from the analysts, social media feeds and multiple internal data sources: sales force automation, sales transaction data, helpdesk activity, service reports, etc.

In this scenario I could first point my application at the market data and understand our market share and trends, then add social media and understand the sentiment (perhaps correlating this to our sales performance). Lastly, why not pull in the helpdesk and service information and see what trends I can pick up to help me plan for the year!
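As a throwaway sketch of that walk-through: every number below is invented purely for illustration, and the ‘application’ is just a few lines of Python rather than any real analytics tool, but it shows the idea of one generic tool pointed at several very different data sets.

```python
# One generic tool pointed at three very different (invented) data sets.
quarters = ["Q1", "Q2", "Q3", "Q4"]
market_share = {"Q1": 0.21, "Q2": 0.22, "Q3": 0.24, "Q4": 0.25}   # from public market data
sentiment    = {"Q1": 0.10, "Q2": 0.35, "Q3": 0.30, "Q4": 0.55}   # from social media feeds
sales        = {"Q1": 100,  "Q2": 115,  "Q3": 112,  "Q4": 130}    # from internal sales systems

def pearson(xs, ys):
    """Simple correlation, written out so the example stays self-contained."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

share_trend = market_share["Q4"] - market_share["Q1"]
corr = pearson([sentiment[q] for q in quarters], [sales[q] for q in quarters])

print(f"Market share moved {share_trend:+.2%} over the year")
print(f"Correlation between sentiment and sales: {corr:.2f}")
```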

Next I get involved in marketing and I’m asked for a ‘headline’ message for a campaign. Now I point my analytics application at the headlines to understand what is resonating in my audience’s minds right now.

Or let’s go the other way, to the ‘non-technical’ user… the manager we all know who has problems booting up their laptop in the morning. How about they have a great application which shows them the weather in detail… well, how about if we could just replace the data with their sales forecast! After all, a bad forecast is a bad forecast!
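To labour the point with one more throwaway sketch: a display routine that has no idea whether the series it is handed is temperatures or a sales forecast (all figures invented):

```python
def simple_dashboard(title, unit, series):
    """Render any labelled series of numbers; the display logic knows nothing about their meaning."""
    for label, value in series:
        bar = "#" * int(value / 5)
        print(f"{title:>15} {label:>4}: {value:6.1f} {unit:3} {bar}")

weather  = [("Mon", 31.0), ("Tue", 28.5), ("Wed", 25.0)]   # invented temperatures (C)
forecast = [("Jan", 80.0), ("Feb", 95.0), ("Mar", 70.0)]   # invented sales forecast ($k)

simple_dashboard("Temperature", "C", weather)
simple_dashboard("Sales forecast", "$k", forecast)
```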

Well, maybe that is a simpler and more productive way ahead; what do you think?

One of my predictions for 2013 was that “real-time” analytics would become mainstream! Here we go… a major step forward.

Andy Sitison

SAP announced today the ability to run SAP Business Suite on top of their HANA in-memory platform. Like the fall of the Berlin Wall, there is little now that stands between the real-time analytics functions and what will be lightning-fast operational business processes running on the same state-of-the-art platform. I have been saying in my blogs for a while that analytic processes aren’t complete unless they kick off operational “actions” that follow through on newly generated insights. With this announcement, SAP takes the industry a step closer to that future, today.

Unlike Oracle, which has built an iron forest of Oracle-specific appliances, SAP has enabled the eco-system to provide a “HANA economy”. The consumer has options for supporting products that resemble the assets that run in their current data centers (which is smart…) and there is a legion of well-trained Global Change Agents (

View original post 277 more words

Happy 30th TCP/IP and are you going to see Tim?

The enabling technology for the Internet turns 30 (here), and the man who devised “the simple link” is coming to town!


Have you seen the ‘Butterfly Effect’ movies? The idea is that there are moments in your life when your decisions set the future course. Maybe a bit out there, but there are times in my career where I think this has happened. For example, when I left university I was lucky to be employed by one of the most innovative companies of the time, which designed test and measurement technologies: a company called Hewlett-Packard.

Being an electronics engineer with a keen interest in computers, I began supporting HP’s computerised data-acquisition products and in particular an interconnect called HP-IB (or IEEE 488 for the die-hards). Long story short, HP started moving to a new standard protocol, and in the mid-1980s I was sent off to be trained up on this new-fangled protocol called TCP/IP.

This week TCP/IP turns 30! (A stark contrast to SNA, Token Ring, etc.) How many technologies have lasted this long? Before you scream at me that Ethernet today is a little different to the 10Base2 I originally learnt about, and that the protocol has evolved, I understand that; but how many people still use floppy disks? (Even the CD and DVD as data storage media are pretty much dead!)

Well Happy Birthday to TCP/IP, however, probably a more important innovation, was the way to simply link a client to a server, the hypertext transfer, which took the fun out of dial-up bulletin boards!!  Sir Tim Berners-Lee is credited with this innovation and the creation of the www in 1989. Since then he has not rested on his laurels and has been a major driver of the ‘open data’ initiative, and he is coming to town, (Here – only Canberra has space). He is visiting Canberra, Melbourne and Sydney at the end of the month, so if you are lucky enough to catch Sir Tim Berners-Lee, PLEASE let us know what he had to say!!