Cisco Live 2018 – Day 1


Today I kicked off my Cisco Live 2018 conference in Barcelona. This is my first visit to the city and my first Cisco Live event, which I am very excited about, especially after carrying out a lot of research in the build-up to the event, looking back at previous years to learn what to expect while attending.

Within my role as Senior Professional Services Consultant at Concorde Technology Group, I act as the technical lead for Cloud and Security. With this in mind, my focus for the week ahead will be on these Cisco technologies, and I have booked a number of sessions throughout the week that cover both of these topics (more blogs to come in the next few days).

My focus today, however, was to explore what else the event has to offer (which is a lot), including some hands-on labs, and to visit some of the different Cisco booths, meeting the engineers and the very impressive DevNet Sandbox team!

After the very easy registration process, I decided to take a trip to the “Hub” area to explore what it had to offer. Being a very hands-on type of person I was itching to do some labs, so I went across to the DevNet Sandbox team. The DevNet Sandbox offers a host of different free labs that customers and partners can take part in within a sandboxed environment. The subject matter is also very impressive and includes multiple Networking, Cloud and Security sandboxed environments. I spent about an hour with the DevNet team going through the process of accessing the material and I was really impressed with the content and the skill set it covers. Whether you are new to the subject or very technical and need to test something, the content covers it all.

For more information, I would recommend visiting developer.cisco.com/sandbox and having a play with the different sandboxed labs that are available.

Speaking to the DevNet Sandbox team made me even more eager for some hands-on labs, so I headed over to the ‘Walk-in Hands-on Labs’. Again the subject matter covered a lot of different aspects including Cloud, Security and IoT (to mention a few). One thing on my Cisco Live bucket list was to set up and demo Cisco Umbrella. For those of you reading that are unaware of this product, Cisco Umbrella uses the Internet’s infrastructure to block a malicious destination before a connection is ever established, delivering security from the cloud.

The lab I took part in included setting up this platform for an internal network, integrating with Active Directory, configuring and installing roaming clients (both AnyConnect and Umbrella) and customising different policies. Firstly, setting up the portal and adding my internal network to use the service took minutes, which was a big surprise. All I needed to do was add my public IP address to a network, and update the DNS forwarders on my Domain Controllers to point to OpenDNS. The only part the lab didn’t cover was the requirement to open port 443 on your firewall, as this was already open in the lab environment.
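
If you want a quick sanity check of that connectivity before rolling Umbrella out on your own network, the sketch below is a minimal Python example of my own (not part of the lab). It tests TCP reachability of the well-known OpenDNS/Umbrella anycast resolvers on ports 53 and 443; note that DNS normally also uses UDP 53, which this simple check doesn’t cover, and the timeout value is just an arbitrary choice.

import socket

# Public OpenDNS / Cisco Umbrella anycast resolvers
RESOLVERS = ["208.67.222.222", "208.67.220.220"]
# Ports Umbrella relies on: 53 for DNS and 443 for dashboard/registration traffic
PORTS = [53, 443]
TIMEOUT = 3  # seconds; arbitrary value for this sketch

def tcp_reachable(host, port, timeout=TIMEOUT):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for resolver in RESOLVERS:
        for port in PORTS:
            status = "open" if tcp_reachable(resolver, port) else "blocked/unreachable"
            print(f"{resolver}:{port} -> {status}")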

Once my network was working with Cisco Umbrella I was then able to configure the Active Directory integration. This allows Umbrella to listen to the Domain Controllers for user and computer logins via the Security event log, as well as enabling IP-to-user and IP-to-computer mappings, a feature a lot of customers require when it comes to reporting. The roaming clients feature was also very impressive, as it enables you to protect machines that are not on your network. You can use either Cisco AnyConnect (if you already use this) or the Umbrella roaming client (if you don’t use AnyConnect) and install it on devices that are going to be away from your internal network. I was really impressed at the ease of deploying this, which included a ready-to-download script that can be deployed via Group Policy.

The final part of the lab was configuring the policies, which dictate the security protection settings as well as block/allow lists, log levels and the block splash page. All in all, I found Cisco Umbrella very user-friendly and a comprehensive tool that will protect your network and infrastructure against malware and virus attacks from outside your network. The one drawback is that the product does not cover email scanning, although it does include some level of SMTP protection.
The final part of my day included a visit to the Cisco Live library for some last-minute cramming for my CCNA Routing & Switching exam… but you will need to wait for another blog later this week to hear about that.

If you want to keep up to date with the events as they unfold here at Cisco Live, feel free to follow my Twitter feed @shabazdarr as well as Concorde’s Twitter feed @ConcordeTG.

Shabaz Darr

Author: Shabaz Darr, Senior Professional Services Consultant at Concorde Technology Group

 

KRACK Attack – Wi-Fi vulnerability – What does it mean to you?


You may have seen in the press that a vulnerability has been identified in the WPA2 Wireless encryption protocol. So what is this vulnerability and what does it mean to you?

Security researchers have discovered a number of vulnerabilities in the WPA2 (Wi-Fi Protected Access II) protocol. These vulnerabilities may allow attackers to gain access to private data traversing your wireless network.

KRACK, the Key Reinstallation Attack, has been demonstrated to decrypt wireless communication on multiple platforms, including Windows, Apple iOS, Android and Linux.

So far the following protocols and ciphers are vulnerable to the attack:
• WPA
• WPA2
• WPA-TKIP cipher
• AES-CCMP
• GCMP

The flaw is not in the cryptography underlying WPA2 or its predecessor, WPA. Rather, it’s in the implementation. When communicating with a client device to initiate a Wi-Fi connection, the router sends a one-time cryptographic key to the device. That key is unique to that connection and that device. This is so that a second device on the same Wi-Fi network can’t intercept and read the traffic between the first device and the router, even though both devices are signed into the same Wi-Fi network.

The problem is that this one-time key can be transmitted more than once. To minimise connection problems, the WPA and WPA2 standards let the router transmit the one-time key as many as three times if it does not receive an acknowledgement from the client device that the key was received.

Because of that, an attacker within Wi-Fi range can capture the one-time key, and, in some instances, even force the client device to connect to the attacker’s bogus Wi-Fi network. The attacker can use the one-time key to decrypt much of the traffic passing between the client device and the router.
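
To see why reusing a one-time key is so damaging, here is a toy Python illustration (my own sketch, not the KRACK attack itself). Stream-cipher-style encryption, like that used by WPA2, XORs the data with a keystream derived from the key and a nonce; if the same keystream is ever used for two messages, XORing the two ciphertexts cancels the keystream and leaks the XOR of the plaintexts, which an attacker can often unpick.

import os

def xor_bytes(a, b):
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Toy "keystream": in WPA2 this would be derived from the session key and nonce
keystream = os.urandom(32)

msg1 = b"card number 4111 1111 1111 1111 "
msg2 = b"meet me at the station at 9 pm  "

# Both messages encrypted with the SAME keystream (what key reinstallation causes)
ct1 = xor_bytes(msg1, keystream)
ct2 = xor_bytes(msg2, keystream)

# An eavesdropper XORs the two ciphertexts: the keystream cancels out,
# leaving msg1 XOR msg2 -- often enough to recover both plaintexts
leak = xor_bytes(ct1, ct2)
assert leak == xor_bytes(msg1, msg2)
print("ciphertext XOR equals plaintext XOR:", leak.hex())
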
So what does this mean to you?

Many vendors have already issued patches to mitigate this security vulnerability. Users are recommended to update and apply patches to their Wi-Fi-enabled equipment. This includes routers, user devices and smartphones.

Contact Concorde Cyber Security on 03331 300600 or email enquiries@tctg.co.uk for more information on how you can protect your business from the latest vulnerability!

 

Author: Carl McDade, Concorde Solutions Architect

How can Concorde Cloud Solutions help with your data protection strategy?

 

Backup and Disaster Recovery has always been a consideration for all customers when it comes to IT. With the evolving security landscape and the recent high-profile ransomware attacks, the ability to provide availability of data is becoming ever more important. Long gone are the days of daily backups and taking untested tapes home.

With all that in mind, having a strong data protection strategy is key to any business. Understanding where your data sits, how that data is protected and how that data can be retrieved are all important factors in building this strategy.

So, how can you protect your data?

Protection of data isn’t just backing it up, it’s making that data available. Offline backup windows are becoming non-existent, and protection of that data is required 24/7 as businesses look to drive down their recovery point objectives (RPO) and recovery time objectives (RTO).

Using Veeam Backup and Replication, you have the ability to protect your critical workloads and data throughout normal operating hours. Veeam’s ecosystem of technology partners allows the protection engine to leverage vendor APIs to efficiently protect those workloads. This integration with market-leading vendors allows the protection of data to occur as often as you would like, whilst minimising the impact to the production workloads.

Using this technology, you can develop your strategy to meet the RPO/RTO set by your business for each application. If you have a tier 1 application, which cannot suffer more than 15 minutes’ data loss, Veeam can help you meet that. If you have a tier 3 application which cannot suffer more than 4 hours’ data loss, you can tailor Veeam to meet that too.
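
To make that concrete, here is a small, purely illustrative Python sketch of my own (the application names, tiers and intervals are hypothetical, not Veeam settings). The underlying check is simple: a backup or replication job only meets an RPO if it runs at least as often as the maximum data loss the business will tolerate for that tier.

from datetime import timedelta

# Illustrative RPO targets per tier -- not Veeam configuration values
rpo_targets = {
    "tier 1": timedelta(minutes=15),
    "tier 3": timedelta(hours=4),
}

# Hypothetical job schedules: how often each application's data is protected
job_intervals = {
    "ERP (tier 1)": ("tier 1", timedelta(minutes=10)),
    "Intranet (tier 3)": ("tier 3", timedelta(hours=6)),
}

for app, (tier, interval) in job_intervals.items():
    verdict = "meets" if interval <= rpo_targets[tier] else "MISSES"
    print(f"{app}: runs every {interval}, {verdict} the {tier} RPO of {rpo_targets[tier]}")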

Where can that data be located?

Although the data protection schedule is important, deciding where that data sits is equally important. If part of your strategy is the protection of your primary data centre, it’s no good only having two copies of your data at that same site.

Veeam have put together a 3-2-1 rule to help customers adhere to best practices. 3-2-1 equates to 3 copies of your data, 2 different media types, and 1 copy offsite.


The 3 copies of your data include the original data set, so this rule encourages you to have two additional backups of your original data to ensure that data can be available in the event of failure. The 2 different media types ensure that this data isn’t all stored on the same type of device; having different media sets, which can all be managed by Veeam, reduces the risk of failure. The 1 copy offsite provides primary site protection.
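
As a quick illustration (a sketch of my own, not a Veeam tool), the Python snippet below checks a list of backup copies against those three criteria: at least three copies in total, at least two different media types, and at least one copy held offsite.

# Each copy is described as (name, media type, offsite?)
copies = [
    ("production data", "primary SAN", False),
    ("local backup", "backup repository", False),
    ("cloud copy", "object storage", True),
]

def meets_3_2_1(copies):
    """Return True if the set of copies satisfies the 3-2-1 rule."""
    total = len(copies)                                   # at least 3 copies (including the original)
    media_types = len({media for _, media, _ in copies})  # at least 2 different media types
    offsite = any(off for _, _, off in copies)            # at least 1 copy offsite
    return total >= 3 and media_types >= 2 and offsite

print("3-2-1 compliant:", meets_3_2_1(copies))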

How can you get your data offsite?

Getting data offsite has always been the sticking point in historical data protection strategies. This element may have relied on individuals taking data offsite, or on expensive storage and collection fees.

Back in 2014, Veeam released the Cloud Connect feature as part of their V8 update to Veeam Backup and Replication. This feature was a catalyst for businesses to extend their data availability strategy to Veeam partners offering Backup-as-a-Service. The model has been heavily adopted, and three years on, the Veeam Cloud Connect possibilities are still growing with the current iteration, Veeam Backup and Replication V9.5.

Veeam Cloud Connect allows customers to connect to cloud-based repositories to store backup data offsite. This connection is completed over the internet with no requirement for site-to-site VPNs or any direct communications. The connection to your service provider is encrypted over the internet and secured via login credentials.

Veeam Cloud Connect allows backups to be moved offsite under the same management console as primary backups. Schedules can be created to ensure that offsite copies complete automatically, with reports then provided on the success of each job.


 

Concorde Cloud Solutions are offering free trials for this technology to all customers. If you’d like to learn more about how Veeam Cloud Connect can help you with your Data Protection Strategy please contact us on 03331 300600 or email enquiries@tctg.co.uk.

Author: Carl McDade, Concorde Solutions Architect

 

Concorde Cloud Solutions

Internet user growth is booming—3 billion people on social media alone

 


If you thought you couldn’t handle any more social media platforms or friend updates on your Instagram feed, you’re not alone. In a collaboration between We Are Social and Hootsuite, a new Global Digital Snapshot shows that the number of people using social media around the world has just passed 3 billion. Mashable reports that this is about 40% of the global population.

Some other interesting numbers for the August 2017 findings show that there are more than 5 billion unique mobile numbers, and 2.7 billion mobile social users. This means much of social media interaction is done on mobile.

The Next Web writes that growth trends for social media are rapidly increasing as well—growing at a rate of one million new users per day over the last quarter. The top categories of apps that users gravitate towards are communication apps, content, games, travel, and shopping. Other top app themes include food, fitness, and finance.

Social media is just one chunk of larger, overall Internet trends. Cisco contributes to mapping the latest phases of digital transformation by publishing traffic projections in its Visual Networking Index (VNI). With these predictions, we can easily see how massive and disruptive the Internet will be in the years to come.

In the most recent VNI report, Cisco found that 58% of the global population will be Internet users by 2021, and that the average Internet user will generate 61GB of Internet traffic per month by the same year. That is a huge 155% increase in data from 2016. Moreover, Internet video traffic—business and consumer—will be 80% of all Internet traffic by 2021.

Because so much of social media is done on mobile, there are also big predictions for the mobility market. Cisco forecasts that global mobile data traffic will increase sevenfold from 2016 to 2021, and that there will be 5.5 billion mobile users in 2021 (which is up from 4.9 billion in 2016).
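
A quick back-of-the-envelope check on that sevenfold figure (my own arithmetic, not part of the VNI report): spread over the five years from 2016 to 2021, a 7x increase works out at roughly 48% compound annual growth in mobile data traffic.

# Implied compound annual growth rate for a 7x increase over 5 years (2016-2021)
growth_factor = 7
years = 2021 - 2016
cagr = growth_factor ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 47.6% per year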

To learn more about Cisco VNI and to check out more Internet and mobility predictions, visit the Cisco VNI pages on cisco.com.

Author: Stephanie Chan, Editorial and Video Producer at Cisco

Used with the permission of http://thenetwork.cisco.com/.

 

Want to Be a Data Visionary? Change the Conversation


What do customers really want? What do they actually need?

If you’re like me, you’ve been trying to answer these questions every day for pretty much your entire professional career. Every conversation you have with a customer is an exercise in peeling the onion—listening to them, trying to understand their unique problems, and eventually getting to the core issues that they are looking for you to solve.

I’ll give you an example. How many times have you heard a customer ask, “Is the cloud right for me?” As IT professionals, we know that the cloud is great. It has a lot of potential, and it can be an extremely valuable tool in developing and bringing solutions to market. And because it’s the shiny new toy in the market, everyone is clamouring to find out how they can use the cloud to do things better than their competition. But as time and experience have shown, we also know that it’s not right for everyone (or everything). So how do we approach this conversation?

Here’s an idea: listen to your customers. They will tell you exactly what they need if you give them the chance. But there’s a twist: you have to ask the right questions.

The world of IT has changed. Customers don’t care about infrastructure and systems anymore. What they care about is their data. They want flexibility, choice, security, and control at a cost that works for their budgets—they couldn’t care less what the underlying storage looks like. They don’t want to hear a load of technology terms thrown at them. Because we’re now talking to CxOs, we’ve got to learn how to speak their language. These people care about business value. What are the outcomes? How is what you’re selling going to help them grow their business?

When you change the conversation to talk about data, you’ll start to see the lights come on. You don’t even need to mention NetApp (or any vendor or technology name for that matter). It’s about asking the right questions. What do you want to do with your data? How do you want to use that data to help you grow as a business? What type of data are you collecting? You’d be surprised what you can find out when you keep the conversation focused on them and their data requirements.

In my “Is the cloud right for me?” example, my customer was looking at modernizing its ERP application. Instead of going back and forth between going all-in with cloud or keeping it on-prem like countless other vendors had done, we started by asking them about their data and how they want to use it. Turns out their primary concerns were pretty standard: governance, security, performance, and quality of service. But none of the proposals that had been put forward were ideal for what the customer was trying to do. That’s because the other vendors had been trying to sell the customer on something they didn’t need, based on a conversation that didn’t focus on actual data requirements. By positioning a solution and a strategy, not just a new piece of kit, we were able to provide the customer with exactly what they were looking for, without compromising.

Of course, at the end of the day, you’ve still got to have something to sell. The solution that we positioned was ONTAP Cloud, and the strategy is Data Fabric. Without even mentioning NetApp, we were able to figure out what the customer was really looking for and how it was using data. Once we peeled back the layers of the conversation and discovered those key requirements, positioning NetApp solutions was simple and natural because you’re not trying to put a square peg in a round hole.

NetApp gives me the scope to widen that conversation. Whether you’re a reseller or a partner, NetApp enables you to act like a service provider, and to help your customers do the same thing. With NetApp, you’re not just selling disparate pieces of gear: you’re selling an ecosystem, a portfolio, and a strategy that your customer can build on for the future.

By talking about data, you can up-level the conversation from just another “me too” technology bidding war. Put yourself in the customer’s shoes. It may sound like common sense (because it is), but I’m always surprised at how often people forget. NetApp gives you the tools to be a data visionary for your customers. But just because you have the world’s best hammer, it doesn’t mean every customer is a nail. Take the time to listen. Ask the right questions. Be the partner they need you to be. And when you’re finally ready to talk tech, NetApp is here to help.

Author: Mark Carlton, Group Technical Services Manager

Top 4 Questions About the Value of the NetApp Data Fabric

The trouble most people have with understanding Data Fabric is that it’s not a product that you can just go out and buy. It’s NetApp’s answer to the future of IT. It’s a way of using a wide portfolio of products to enable continuous data availability across multiple on-premises and cloud platforms.

But the real value of a Data Fabric is that it provides a platform for transforming your business.

While it’s not as simple or easily measurable as just expanding your bottom line, the real value of a Data Fabric is its power to transform your business.

I typically hear four questions about the value of a Data Fabric:

  1. How can it change how I utilise my infrastructure?
  2. How can it help me use my resources better?
  3. How can it help me use my data more efficiently?
  4. How can it help my business make money?

How can Data Fabric change how I utilise my infrastructure?

Whether you’re an existing NetApp customer with a data centre full of NetApp kit or not, the NetApp Data Fabric can help you get more out of your IT infrastructure.

Let’s say your business has a new requirement to provide backup, test and development in the cloud, but you don’t want to have a large admin team to manage all the different tools or equipment required to deliver this solution. So you need to make sure the solution is easy to manage, with full choice and control over your data.

You can build a data fabric to address these challenges and I don’t mean by some “one-size-fits-all” compromise either. I can think of three data fabric components that we can use to meet our needs: FlexArray, ONTAP Cloud, and AltaVault.

FlexArray would provide you with the capabilities to sweat the assets you already have, so you wouldn’t need to replace all your existing storage. In fact, if you wanted to keep it, you could use FlexArray to repurpose it to run ONTAP. This gives your existing storage all the efficiency benefits of ONTAP.

ONTAP Cloud brings that on-premises efficiency and control to the cloud. With ONTAP Cloud you are able to replicate data from your onsite ONTAP array out into AWS or Azure. In an instant, it can provide a test and dev environment without having to pay for hardware, and it enables you to scale operationally.

AltaVault provides you with end-to-end efficiency and security when moving data to the cloud. It supports all leading backup and archive software, giving you flexibility and choice to fit it into your existing infrastructure without compromise. It can be deployed as a physical, virtual, or cloud-based solution. In less than 30 minutes, you can be backing up your data from any of your on-premises environments to the cloud of your choice.

How can Data Fabric help me use my resources better?

The Data Fabric gives you choice without sacrificing control of your data. This is key to a successful IT strategy. Forget about trying to predict what you’re going to do in 3-5 years. Think about how your decisions can change your business today. With NetApp Data Fabric and the technologies that enable it, you can buy for what you need today and scale for what you need tomorrow. Your infrastructure is agile and adaptable to your dynamic business requirements.

How can Data Fabric help me use my data more efficiently?

ONTAP 9

ONTAP 9 is the pinnacle of NetApp’s quarter century of innovation and is at the very heart of NetApp’s data fabric strategy.

NetApp continues to build capabilities into the platform to ensure that your key data assets are not only stored efficiently but are highly available, protected and secured.

However, the true power of ONTAP is in its flexibility. The ability to run ONTAP not only on “traditional” physical controllers, but also as a software-defined option with ONTAP Select or in the public cloud with ONTAP Cloud, means ONTAP can seamlessly move data not just between storage tiers and controllers, but between virtual appliances and cloud providers too. All of this while maintaining the same capabilities you expect on-premises, meaning your data management, protection, security and analytics tools work in exactly the same way, regardless of ONTAP’s location.

Add to that NetApp’s ambition to mirror data between any platform in its portfolio via SnapMirror to Anywhere technology, and you can see how your data fabric can take shape.

How can Data Fabric help my business make money?

A good portion of our IT budgets is probably spent just keeping the lights on. How much do you actually spend on development that moves the business forward?

A couple of months ago, a customer approached me to build an infrastructure that gives them the ability to run their business at peak workloads during the heavy sales period in the last three months of the year.

They wanted a virtualisation environment with a storage platform to run the required 200 servers during these peak times. The rest of the year, the environment runs at 50% of the peak workload (only having to run 100 servers). If this was a fixed, CapEx-based infrastructure, they would have unused equipment sitting around for most of the year. Over a three-year contract, that’s 27 months of wasted investment.
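
The arithmetic behind that figure is worth spelling out (a quick sketch using the numbers above): over a 36-month contract the extra capacity is only needed for three months a year, so the additional 100 servers would sit idle for the remaining 27 months.

# Figures from the example above
contract_months = 3 * 12          # three-year contract
peak_months = 3 * 3               # busy period: last three months of each year
peak_servers, baseline_servers = 200, 100

idle_months = contract_months - peak_months   # 27 months where the extra capacity is unused
extra_servers = peak_servers - baseline_servers

print(f"Extra {extra_servers} servers sit idle for {idle_months} of {contract_months} months")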

With a Data Fabric, we allowed them to achieve the same capabilities at a much lower cost. We started by deploying a virtualized flash platform on premises to account for standard workload and capacity requirements. While that flash platform may be able to cope with some of the burst that’s required as the business ramps up to its busy time, that’s not the only requirement. Compute and possibly additional storage may be needed for the extra 100 VMs.

A Data Fabric allowed us to use a hybrid cloud solution to address this challenge. By using ONTAP Cloud, we could seamlessly move data between the on-premises kit and either AWS or Azure.

Our fabric strategy also had the flexibility, if needed, to use a NetApp Private Storage (NPS) solution, allowing you to keep your data on your own NetApp systems for constant, guaranteed performance, whilst using your choice of public cloud providers for computing. This solution gives you the ability to scale up or down on demand and only pay for what you need when you need it, saving you that capital expenditure.

If you’ve been asking yourself, “What does Data Fabric mean for me and my business?” you’re not alone. Data Fabric is NetApp’s vision for the future of IT, and the benefits to your business both now and in the future are unmatched in the industry. I have spoken to a lot of customers over the past year and one thing I have learned is that the Data Fabric can help you solve your business challenges today and in the future so…

What are you waiting for?

 

Author: Mark Carlton, Group Technical Services Manager

Understand how GDPR could affect your business


I have been asked in a number of meetings over the past few months “what is GDPR?” and in some cases “What do I have to buy?”

But let’s get one thing straight from the start: GDPR is NOT an IT problem, and you can’t just buy something and make it go away. This is a common misconception, and I thought I would take the time to jot down what I have learned so far and see if it can help you.

The EU General Data Protection Regulation (GDPR) comes into force on 25th May 2018. It applies to all organisations processing the personal data of EU residents, and the regulation will introduce a new and enforced way for organisations to handle data protection. The penalties for non-compliance with GDPR can be up to 20 million euros or 4% of a company’s annual turnover. In addition, data subjects gain a right to claim compensation against an organisation under GDPR.

It is important to understand your obligations and to start working towards your compliance requirements. Being ready by 25th May 2018 will be a major undertaking, but the risks of not being prepared for GDPR are too big to ignore.

What are the new requirements?

Privacy by Design – GDPR introduces formal principles of Privacy by Design into the Regulation, which include reducing your data collection to what you actually require, limiting the retention of this data, and gaining clear consent from consumers to process their data.

Right to Erasure – The current EU data protection Directive already provides a right for consumers to request data deletion, but GDPR extends this to include data that has been published to the internet. This is where you hear a second term, the “right to be forgotten”, which extends to keeping your data fully out of public view and ensuring it is removed from all systems.

Breach Notification – Within 72 hours of a personal data breach being discovered, you have to inform the appropriate authorities. This also has to be extended to the data subjects if the breach is classified as a “high risk to their rights and freedoms”.

Fines – Now this is where most companies’ ears perk up: GDPR introduces fines that can be up to 4% of a company’s global revenue or 20 million euros, whichever is higher.

Data Protection Impact Assessments (DPIA) – A DPIA is required in high-risk situations, for example where a new technology is being deployed or where a profiling operation is likely to significantly affect a subject.

Data Protection Officer (DPO) – Not all companies have a DPO, but if you don’t, I would advise that you assign this duty to someone in your organisation to take proper responsibility for your data protection compliance. Below are the regulation details identifying whether you need a DPO.

“DPOs must be appointed in the case of (a) public authorities, (b) organisations that engage in large-scale systematic monitoring, or (c) organisations that engage in large scale processing of sensitive personal data (Art. 37).  If your organisation doesn’t fall into one of these categories, then you do not need to appoint a DPO.”

Consent – GDPR introduces strict new regulations around collecting data: you have to make sure that you are clear and concise when requesting consent from the subject. You have to define what the data is being collected for and make sure that it is only used for that purpose. As a controller of data, you are responsible for making sure you have an audit trail of consent for all data collected from a subject. As a business, you may need to review how you’re collecting and recording consent and whether you need to make any changes to your procedures.

Children’s data protection – GDPR will bring in special protection for children’s personal data, focused particularly on commercial internet services such as social networking. To put this into context, if you collect data about children, then you will need consent from a parent or guardian to process any personal details lawfully. This may have significant implications for your organisation if your business is aimed at children and collects their personal data. Again, all consent has to be clear and defined when collecting children’s data, and your privacy notice must be written in language that children will understand.

Does Brexit mean I have to comply?

There are a few misconceptions around Brexit when it comes to GDPR, the main one being that “Brexit means we don’t have to comply”. This is FALSE! Businesses will still have to adhere to this regulation: it is an EU regulation that protects EU citizens’ data, which means that if you hold any details about an EU citizen you have to make sure you are compliant and have taken the necessary steps, regardless of jurisdiction.

As I said above, GDPR comes into force on 25th May 2018 and we will still be in the EU, so don’t bury your head in the sand.

Now, there are a number of other requirements that you may need to meet to comply with EU GDPR, but I am not a legal expert. So please take the time to investigate where you stand in relation to GDPR, understand your risks and know what data you hold. Attend an event and discuss it further with legal experts to help you start and build your foundations for GDPR.

Author: Mark Carlton, Group Technical Services Manager