
What is data loss prevention?

Business in this modern age is fraught with challenges and potential hazards; most of us are clearly aware of this.  However, a great number of new opportunities and possibilities have also opened up in the last couple of decades, thanks of course to the influence of the IT sector.  On the “hazard” side of the equation we have the problem of data loss, and as the value of data increases, preventing it becomes even more significant.

So, just what is data loss prevention anyway?  In a nutshell, DLP (also sometimes called data leak prevention, content monitoring and filtering, or information loss prevention) is a way of securing data so that it isn’t able to fall into the wrong hands.  The most obvious illustrations come from the corporate world, where the goal is to prevent critical or sensitive data from being leaked to competitors or criminals.  However, data loss prevention isn’t always about creating controls to prevent the theft of information; it can also help stop the accidental disclosure of data.

The thing is, data loss prevention isn’t limited to any specific type of information; such systems are designed to protect data of every kind, whether private, public or corporate.  For instance, if we’re talking about something like hospital or financial records, then it’s in the best interests of the host company to avoid letting such information leak out, for fear of lawsuits and/or loss of confidence in the institution itself.  Take credit card information, for example – if a business were to accidentally leak this type of customer information to the public, it would likely set off a firestorm that could do irreversible damage to the company’s reputation.

Similarly, DLP solutions are often set up around anything that might be deemed “intellectual property”.  For organizations which primarily supply digital products or services, data loss prevention is absolutely critical if the business wants to remain active in the long run.  In other sectors, the loss of data might simply mean that production is set back; but this too is dangerous, because (as you well know) many factors feed into any manufacturing operation, and lost time means lost profits.

As you might have already guessed, one of the best ways to implement a sounder approach to data loss prevention is for businesses to focus on training and/or certification in the subject for their IT workers.  There is no substitute for having experienced people on the “ground floor” of any DLP program.  Oftentimes it’s the small details which make all the difference when it comes to preventing information from leaking out.  If you’ve got the right personnel (with the right training) at your side, you’ll find that it’s possible to rest much easier at night knowing that there is an effective solution in place.

Data Loss Prevention Methodologies

DLP comes in several “flavors” or categories, if you will.  First off, there are your standard security measures, like antivirus programs and firewalls.  Intrusion detection systems also qualify as a standard means of enacting data loss prevention.  The purpose of all three of these tools is essentially to prevent uninvited guests from getting inside your network and stealing data.
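
To make the intrusion detection idea concrete, here is a minimal sketch of the kind of check such a system performs – flagging repeated failed logins from a single address.  The log format and threshold here are invented for illustration, not taken from any particular product.

```python
from collections import Counter

# Hypothetical parsed auth-log entries: (source_ip, event) pairs.
events = [
    ("203.0.113.7", "login_failed"),
    ("203.0.113.7", "login_failed"),
    ("203.0.113.7", "login_failed"),
    ("198.51.100.2", "login_ok"),
]

FAILED_LOGIN_THRESHOLD = 3  # assumed alert threshold

def detect_bruteforce(events, threshold=FAILED_LOGIN_THRESHOLD):
    """Flag any source IP whose failed-login count reaches the threshold."""
    failures = Counter(ip for ip, event in events if event == "login_failed")
    return [ip for ip, count in failures.items() if count >= threshold]

print(detect_bruteforce(events))  # ['203.0.113.7']
```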

Next you have advanced security measures, which tend to be a bit more sophisticated.  One popular technique in this category is subversion via “honeypots” – elements which are isolated from the rest of the infrastructure and set up to appear vulnerable, so that the actions of an intruder or virus can be analyzed.  There are even more advanced measures which involve A.I., machine learning, and anomaly-detection algorithms that constantly scan for abnormal behavior.  Even if these systems don’t immediately catch a would-be problem, they are often invaluable after the fact because they can provide crucial information about how an attack was carried out.
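
For a sense of how simple a low-interaction honeypot can be, here is a minimal sketch: a listener on an otherwise-unused port that does nothing but record whoever connects.  The port number and log format are arbitrary choices for illustration.

```python
import socket
import datetime

HONEYPOT_PORT = 2222  # chosen to look like a misconfigured SSH service

def run_honeypot(port=HONEYPOT_PORT):
    """Accept connections on a decoy port and log every probe."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("0.0.0.0", port))
    server.listen(5)
    print(f"Honeypot listening on port {port}")
    while True:
        client, addr = server.accept()
        # Nothing legitimate should ever connect here, so every hit
        # is worth recording and analyzing later.
        print(f"{datetime.datetime.now().isoformat()} probe from {addr[0]}:{addr[1]}")
        client.close()

if __name__ == "__main__":
    run_honeypot()
```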

There’s also encryption and access control, which are great DLP solutions.  Naturally, if data is encrypted, it’s going to be much tougher (or in some cases, extremely impractical) for thieves to do anything useful with said information.  Also, if encryption is accompanied by some type of tracking initiative, then it’s often possible to locate would-be criminals before they’ve even had a chance to extract any value from the data.  Perhaps the single best way of ensuring that data remains protected at a basic level, however, is to enact access controls.  In essence, access controls are simply protocols which make sure that only certain individuals with clearance are able to view, copy or share certain types of information.  There are multiple ways in which this is carried out; for example, access control might incorporate rule and regular-expression matching, structured data fingerprinting, published lexicons, and/or statistical methods.
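
To make “rule and regular-expression matching” concrete, here is a minimal sketch of the kind of content filter a DLP tool might run over outbound text, pairing a credit-card regex with a Luhn checksum to cut down on false positives.  The pattern and policy are illustrative, not any particular product’s rules.

```python
import re

# Candidate card numbers: 13-16 digits, optionally separated by spaces/dashes.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Luhn checksum, used to weed out random digit strings."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def contains_card_number(text: str) -> bool:
    """Return True if the text appears to contain a valid card number."""
    return any(luhn_valid(m.group()) for m in CARD_PATTERN.finditer(text))

print(contains_card_number("Order notes: 4111 1111 1111 1111"))  # True
print(contains_card_number("Invoice #1234567890123"))            # False
```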

Clearly, data loss prevention is something which should concern everyone in an organization, particularly management and IT.  Data has real intrinsic value, now more than ever, so it’s in your company’s best interest to protect its critical information at all costs.  Moreover, if you want to maintain your profits, continue growing and make a difference in your particular industry, consider adopting a wide and varied approach to DLP.


The importance of data warehousing in the age of Big Data

In a sense, a data warehouse is the central repository for all the collected and stored information that a business creates and compiles.  Moreover, analysis and reporting have always been facets of most contemporary data warehousing strategies.  So it’s a bit perplexing (for most people) when it comes time to draw a distinction between what might be categorized as a “data warehousing” strategy vs. a “Big Data” strategy.  After all, we’re talking about two approaches to amassing and analyzing data here.

The rise of Big Data actually refers to the situation that most businesses and organizations find themselves in these days; namely, that data is pouring in so fast that there’s confusion over where to put it.  In other words, traditional data warehousing methods are being seriously strained when it comes to dealing with the constant onslaught of information being accumulated.  Naturally, companies that specialize in dealing with Big Data are jumping at the opportunity to solve this problem, and we’re seeing quite a bit of growth across the board in this area as well (which is also good for the IT sector in general).  However, this doesn’t mean that data warehousing itself is going the way of the dinosaur; if anything, it’s forcing everyone to reevaluate their systems and look for ways to integrate new tools.

Before we go any further, let’s identify what a “data warehouse” actually is.  From Wikipedia:

        “It is a central repository of data which is created by integrating data from one or more disparate sources. Data warehouses store current as well as historical data and are used for creating trending reports for senior management reporting such as annual and quarterly comparisons.”

Basically, a data warehouse is a vital component of any business or organization, responsible for routinely providing critical insights to individuals in management positions.  Traditionally, data warehouses have tended to be on-site, meaning companies set up and run them in a very direct, hands-on manner.  But, as previously stated, problems are popping up because of the influence of Big Data.

So what’s the solution?  In a nutshell, we’re seeing businesses turn toward integrating Big Data solutions into their data warehouse schemes.  This is an especially interesting development when you consider that more and more people are looking at cloud-based approaches to Big Data storage and analysis.  Here’s a little snippet from Gartner research on the subject:

        “In 2012 Gartner recorded a significant increase in inquiries from organizations seeking to deploy data warehouses and analytic data stores for the first time,” the report said. “This might sound incredible over 25 years into the data warehousing era, but we asked these clients several questions to confirm that they were contemplating truly ‘greenfield’ data warehousing initiatives.”

This information pretty much speaks for itself – the demand for 3rd-party data warehousing services is definitely on the rise.  So how does Big Data fit into all of this, you ask?  Well, if organizations are interested in warehousing and analysis, it only follows that they would want to take a look at some of the amazing developments in the field of Big Data with regard to cloud-based services.  A great example of this would be Google BigQuery.  In short, BigQuery is an SQL-based analysis and storage solution that offers users extremely fast results as well as hundreds of terabytes’ worth of storage space.  Additionally, we’re talking about a “metered” service here: you only pay for what you actually use, so the costs are going to be much lower than the investment required to build and maintain on-site resources.
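
For a taste of what that metered, SQL-based model looks like in practice, here is a minimal sketch using Google’s google-cloud-bigquery Python client to query one of the public sample datasets.  It assumes a GCP project with billing enabled and application-default credentials already configured.

```python
from google.cloud import bigquery

# Assumes default credentials and project are already set up.
client = bigquery.Client()

# Standard SQL against a public dataset; you are billed only for the
# bytes the query actually scans -- the "metered" model.
query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    WHERE state = 'TX'
    GROUP BY name
    ORDER BY total DESC
    LIMIT 5
"""

for row in client.query(query).result():
    print(f"{row.name}: {row.total}")
```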

Tools like Google’s BigQuery can (and probably should) be used in tandem with a more traditional approach to data warehousing.  In other words, institutions could look at diverting their data overflow to services like BigQuery while keeping “business-crucial” data on-site, closer to home.  Alternatively, it’s entirely within reason to imagine an integrated approach where everything flows through the data warehouse first and is then diverted to cloud-based services for Big Data processing and storage.
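
One way to picture that hybrid arrangement is as a simple routing layer in the ingestion pipeline.  The sketch below (with an invented “business-crucial” tag and stand-in stores) keeps sensitive records local and sends the overflow to the cloud.

```python
# Stand-ins for the two destinations; in practice these would be the
# local warehouse loader and e.g. a BigQuery load job.
on_site_warehouse = []
cloud_store = []

def route_record(record: dict) -> None:
    """Keep business-crucial records on-site; divert everything else."""
    if record.get("business_crucial"):
        on_site_warehouse.append(record)
    else:
        cloud_store.append(record)

route_record({"id": 1, "business_crucial": True, "payload": "contract terms"})
route_record({"id": 2, "business_crucial": False, "payload": "clickstream event"})
print(len(on_site_warehouse), len(cloud_store))  # 1 1
```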

Regardless, this is pretty good news for IT professionals, because it means there is a growing demand for people who know how to deal with data warehousing and/or Big Data.  In truth, it might be wise to complete training and certification courses in both areas, if for no other reason than to become more versatile across a larger career space.

Need guidance? Click here for a great data warehousing toolkit

Click here for complete certification in Big Data


How knowledge management promotes collaboration & synergy

You know what they say: “two heads are better than one”.  Throughout human history the very survival of our species has often been tied to our ability to think and act in groups.  One might even say that mutually beneficial goals and perks are part of the very lifeblood of our entire civilization.  Knowledge management works in a similar spirit: it creates centralization, a kind of “gravity” around how information is collected and used, and it can bring everyone in an organization together under a common bond or purpose.

There are great examples of knowledge management in action all around us.  Perhaps one of the best examples of how knowledge management can inspire collaboration is the website Wikipedia.  Take, for instance, this whitepaper by Christian Wagner of City University of Hong Kong, titled:

      “WIKI: A TECHNOLOGY FOR CONVERSATIONAL KNOWLEDGE MANAGEMENT AND GROUP COLLABORATION”

He’s right – that’s all Wikipedia really is: a unified system which allows the collective users of the site (basically everybody on the internet) to collaborate and improve our shared understanding.  In many ways, Wikipedia is one of the best examples of success when it comes to knowledge management; however, it’s not necessarily the best illustration for businesses that are interested in KM.

When most businesses begin seriously looking at knowledge management, the tendency is to internalize the entire process.  The reason for this is simple: they don’t want information and/or insights leaking out to the public or competitors (which is perfectly understandable).  So, whereas something like Wikipedia might be an online database that anyone can tap into, a business’ approach to knowledge management tends to be a little more “sheltered”, to say the least.

Complete Certification in Knowledge Management can be yours at an unbelievably attractive cost, Click here for more information…

In fact, a company’s KM strategy might even be divided into subgroups which either restrict or grant access to certain pools of information.  In other words, maybe only people of a certain rank or title within an organization can access the more “sensitive” bits of data.  Of course, everyone would still have access to the database at large; they just wouldn’t have sweeping “administrative” powers.  Why take this approach in lieu of simply opening up the project controls to everyone?  By limiting access to certain things, you’re also directing certain people toward others.  For example, if there are two main sections / divisions in a database and I prevent most users from accessing the first section, they will naturally delve into the second (because there’s nowhere left to go, of course).
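
As a minimal sketch of that kind of tiered access (the roles, sections and permission names here are hypothetical), a KM system might check clearance like this:

```python
# Hypothetical role-based permissions for a KM database: everyone can read
# the general section, but only some roles reach the sensitive pool or
# hold administrative powers.
PERMISSIONS = {
    "staff":   {"read:general"},
    "manager": {"read:general", "read:sensitive"},
    "admin":   {"read:general", "read:sensitive", "write:all"},
}

def can_access(role: str, action: str) -> bool:
    """Return True if the given role is cleared for the given action."""
    return action in PERMISSIONS.get(role, set())

print(can_access("staff", "read:general"))    # True
print(can_access("staff", "read:sensitive"))  # False
print(can_access("admin", "write:all"))       # True
```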

Those in positions of control can use this fact to promote new types of synergy among different departments, if they are so inclined.  In other words, it’s a bit like diverting water through a series of pipes until everything pours into one chamber.  That’s an oversimplification, of course, but there are clearly other benefits associated with carefully choosing who you grant universal access to.  Moreover, when people have too many options to choose from, there’s a higher chance that confusion will arise.

Most of the time, knowledge management in a business environment is powered by software systems which not only maintain the database and oversee its organization, but also promote collaboration and autonomy.  What is autonomy, you ask?  Simply put, it is self-governance: a system that is able to ensure that knowledge is being collected and filed in the most appropriate manner for later use.  One might even argue that autonomous design is among the most important aspects of knowledge management in general, and key to the overall success of such a program.  What this often leads to is something called personalization, which allows distinctly different people to interface with one another more efficiently.  Let’s face it, not everyone possesses the same knowledge base or skills; through personalization (directed by autonomous knowledge management), people from entirely different departments can effectively work together to achieve a common goal.

Similarly, collaboration networks will often emerge as a result of the influence of knowledge management.  Autonomous software systems can use personalization (along with other statistics and criteria) to create recommendations for group collaboration.  Needless to say, this is a pretty amazing use of knowledge management, and it’s one that more and more businesses should consider taking a look at.
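
To give a flavor of how such recommendations might be generated, here is a minimal sketch that pairs people whose skill profiles complement each other.  The profiles and the scoring rule are invented for illustration; a real system would draw on far richer personalization data.

```python
from itertools import combinations

# Hypothetical skill profiles drawn from a KM system's personalization data.
profiles = {
    "Ana (engineering)": {"python", "databases"},
    "Ben (marketing)":   {"copywriting", "analytics"},
    "Cho (engineering)": {"python", "networking"},
}

def complementarity(a: set, b: set) -> int:
    """Score a pair by the skills they do NOT share; more complementary
    profiles suggest a more useful cross-department collaboration."""
    return len(a ^ b)  # symmetric difference

best_pair = max(
    combinations(profiles, 2),
    key=lambda pair: complementarity(profiles[pair[0]], profiles[pair[1]]),
)
print("Suggested pairing:", best_pair)  # ('Ana (engineering)', 'Ben (marketing)')
```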

If used properly, a knowledge management program / system can help you and your organization achieve some pretty amazing things.  Additionally, any business that adopts a dedicated approach to KM will find that it’s able to increase the speed and efficiency of most of its operations, meaning higher profits and more satisfied customers.  At the same time, KM promotes collaboration, which can open up new possibilities and/or help create entirely new products, ideas or services.  The point is, if you haven’t taken a serious look at knowledge management yet, right now is a great time to do so.

Get certified in Knowledge Management, click here now!

How business continuity planning promotes long-term viability

You know what they say: when it comes to doing smart business, one should “expect the unexpected”.  Oftentimes, it is the things that don’t exactly go according to plan that derail an otherwise solid business strategy.  Of course it’s virtually impossible to construct a business model that’s impervious to all risks, but that doesn’t mean you can’t position your assets in such a way as to mitigate known or unknown factors.  What you should really be after is an approach to running your company that allows it to become more “viable” over the long term.

The act of assessing internal and external threats and generating a strategy for dealing with them in advance is called “business continuity planning”.  BCP is essentially divided into two distinct areas – prevention and recovery.  Although it’s always a good idea to keep a handle on the status of the many internal elements driving your business, these days, it would seem, external factors might be the biggest threat.  For example, just think about the high number of natural disasters we’ve seen over the course of the last few years, many of which wreaked havoc on organizations with operations in the affected areas.  Similarly, even short of a full-on environmental catastrophe, there’s potential risk of these same issues taking down remote servers and/or critical 3rd-party services.  After all, a storm doesn’t have to actually hit your main facility to disrupt your operations; it could just as easily take out your remote data center.

So, how does one begin setting up a business continuity plan, you ask?  First off, you’ll need to conduct a risk assessment.  Quite simply, this entails looking at all possible risks and identifying which ones are potentially the most damaging.  As you might expect, if you were to prepare for every conceivable risk factor, you could lose a fortune in the process.  The best approach is to identify those risks which could be called the “most critically damaging to the continuity of operations” so that you can focus mainly on them (instead of “spreading yourself too thin”, so to speak).
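
A common way to formalize that triage is a simple likelihood-times-impact score.  The sketch below, with invented ratings on a 1–5 scale, ranks risks so that attention goes to the most critically damaging ones first.

```python
# Hypothetical risk register: each risk rated 1-5 for likelihood and impact.
risks = [
    {"name": "regional power outage",   "likelihood": 3, "impact": 4},
    {"name": "data center flooding",    "likelihood": 2, "impact": 5},
    {"name": "key supplier bankruptcy", "likelihood": 1, "impact": 3},
]

def score(risk: dict) -> int:
    """Classic risk-matrix score: likelihood multiplied by impact."""
    return risk["likelihood"] * risk["impact"]

# Highest-scoring risks get contingency planning first.
for risk in sorted(risks, key=score, reverse=True):
    print(f"{score(risk):>2}  {risk['name']}")
```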

Next, you’ll want to identify which services / methods are the most important for the continuing operation of your business. For instance, if your company is reliant on orders that come in from the internet via a number of distinct servers, it would be in your best interest to make sure that there is some type of back-up system or contingency plan in place where those servers are housed and operated.  It’s this type of “thinking ahead” approach that ultimately separates successful businesses from less successful ones.

Once you have your basic risk list ironed out, you can start looking at individual issues and taking action.  In any potentially damaging scenario the end goal should be the same – to determine how you’ll actually get work done in the event of the unforeseen.  Sometimes this might also include specific directions for your employees, team or board members.  In cases where you’re dealing with a breakdown in internal IT services, you might actually look at using things like social media to “fill in the gaps”.  For example, if some element of your IT infrastructure were to go down such that internal communications were compromised, it would be extremely handy to have an understanding among your employees that certain social media services might temporarily replace them.  In other words, your team members would receive and share their communications / data via their personal devices and private accounts.

Finally, once you have addressed all the critical risk factors and created contingency plans for them, use this as another marketing tool.  In other words, tell your customers and clients about your business continuity planning operations.  Not only will this let them know that you’re thinking ahead, it will also imbue them with a much stronger sense of trust.  In some instances you might find that the number of customers you deal with on a regular basis dramatically increases because of your BCP efforts.

How can you learn more about business continuity planning?  If you’re just discovering this field, or interested in implementing your first business continuity planning project, you’re going to want to engage in some research and/or training.  A solid kit or course, like the one offered here, can help you uncover the best strategies for both prevention and recovery.  Avoiding interruptions to your operations is very important these days, especially when you consider the increasing speed at which modern business is conducted.  Just a few days of disruption could set your company back to a point where recovery is impossible or large amounts of profit are lost.  The bottom line is that you should take steps now to prepare, so that you, your employees and your business are protected if / when chaos comes knocking.

Click here for direct access to one of the leading Business Continuity Risk Analysis and Testing Kits on the market.


Haptics technology coming to Samsung by way of Immersion Corp.

While Samsung has already been implementing basic forms of Haptics technology in some of their mobile products, a recently signed multi-year license agreement with Immersion Corporation will likely propel them forward even further.  As you are no doubt already aware, the mobile market has been flourishing over the course of the last couple of years, and Samsung (which sold 86.6 million units in the first quarter of 2012) has emerged as the pack leader.

Since it’s clear that Samsung is riding atop this wave ahead of everyone else, it’s also apparent that they are the company to watch when it comes to mobile tech.  Moreover, if they’re starting to focus more on Haptics, this might be a signal that new uses for the technology in consumer markets might very well be just around the corner.

Their partner in this arrangement, Immersion Corp., is arguably the most well-known and high-profile company when it comes to the development and distribution of Haptics technology.  But just who is Immersion, and what separates them from all the others?  The organization was reportedly founded in 1993 and boasts a whopping 1,200 patents associated with its globally distributed products.

Aside from providing Haptics tech for businesses focused on mobile markets, Immersion also works with those in the medical, gaming, automotive, and consumer electronics markets.  One of their most recent efforts, called “tactile presence”, was created in order to share “physical” data between two devices.  In short, this means that as you are typing, you might be able to feel the sensations of the other person’s fingers.

Naturally, this is all part of a concerted effort to push electronic devices toward being able to convey “emotional” information.  Needless to say, this is all very exciting, especially for consumers.  Since Samsung is clearly interested in breaking new ground with mobile technology, it’s very likely that we’ll see many others following suit.

The question is – what does this mean for the world at large?  Well, it’s entirely possible that Haptics technology is going to transform the IT world in some large or small way in the very near future.  When you really sit down and look at what Haptics is, it becomes clear that we’re talking about additional data here.  In other words, Haptics is fast becoming another channel through which users can input information.  Likewise, this is also an opportunity for organizations to gather critical data about their users in a very streamlined fashion (for whatever purpose).  Perhaps Haptics could be used to determine the emotional states of users in the near future?  In this way, it might be possible not only to develop simple integrated systems for helping people manage perceived stress, but also to more accurately assess when a person is apt to make purchases (useful in marketing).

Anyone who works in an IT-related capacity should strongly consider looking at additional training and education materials in the field of Haptics.  After all, we’re really talking about filling in a gap in sensory data here (among the 5 senses – sight, hearing, smell, taste and, of course, touch).  One could argue that one of the goals of the technological revolution is to be able to deliver and collect information to and from all 5 human senses.

In this way, Haptics technology puts us one step closer to realizing this (and others are working on smell and taste as we speak).  It’s pretty much basic reasoning: since one of the largest mobile developers is moving in the direction of Haptics, it’s safe to assume that there could be growth in this area.  Furthermore, growth means that more employment opportunities will open up, which in turn implies that more experienced IT professionals will be required.

Pick up this book today!

Haptics: High-Impact Strategies – What you need to know

So, what’s the bottom line?  We’re seeing a very powerful mobile tech company team up with one of the heavy hitters in the world of Haptics, and via a multi-year contract, no less.  It’s safe to assume at this point that these two organizations have some long-term plans in mind.  According to statements made by key individuals within Immersion, they’re actually interested in further monetization of their technologies, particularly within the mobile markets.  Similarly, Samsung is always on the lookout for the next big thing, so it’s going to be very interesting to see what they roll out over the course of the next few years.