Internal Combustion Engines, Critical Thinking and Making Good IT Decisions

Germany’s Spiegel magazine has reported that the German Bundesrat (Germany’s federal council, with representatives from all 16 German states) will ban the internal combustion engine beginning in 2030. If the ban stands, the only way to achieve this goal would be mass adoption of electric cars to replace today’s cars, which are powered almost exclusively by internal combustion engines. This is a bigger issue in Germany than it would be in the United States, since there are significantly more cars per person in Germany than in the US.

Sounds like a good idea, but edicts passed down from senior managers are not always feasible, particularly when those managers might not have done the math to determine if their ideas can actually be implemented by those in the trenches. For example, here’s the math on the Bundesrat’s edict:

  • As of the beginning of 2015, there were 44.4 million cars in Germany. If we assume that the average German car is driven 8,900 miles per year and gets 30 miles to the gallon, each car consumes the equivalent of just under 10 megawatt-hours of electricity per year (based on one gallon of gasoline = 33.7 kWh).
  • Replacing all 44.4 million cars with electric vehicles would require generation of 443.9 terawatt-hours of electricity per year solely for consumption by automobiles (9.998 MWh per car x 44.4 million cars).
  • In 2015, Germany produced 559.2 terawatt-hours of electricity from all sources. That means that Germany would need to produce or import about 79% more electricity during the next 14 years than it does today. However, during the 13-year period from 2002 to 2015, German production of electricity increased by only 12%.
  • If the additional electricity needed for use by cars came from wind generators, it would require 64.5 million square miles of wind farms (based on an average of 93.0 acres per megawatt of electricity generated), an area that is 468 times larger than Germany’s footprint of 137,903 square miles.
  • If the additional energy came from solar, it would require 1.22 million square miles of solar panels (based on an optimistic assumption of 13 watts of electricity generated per square foot), an area about nine times larger than Germany.
  • If the additional energy came from nuclear power, Germany would need to build the equivalent of 13 high-capacity plants (assuming they have the capacity of the largest US nuclear plant, operating at Palo Verde, AZ).
  • Germany could use all of the oil it currently imports for automobiles for the production of electricity, but that would defeat the purpose of switching to electric cars.
  • Consequently, the only logical options for achieving a complete ban on the internal combustion engine by 2030 are a) build lots of new nuclear power plants that will generate the electricity needed for electric cars, or b) reduce driving in Germany by at least 85%. But even the second option would require substantially greater production of electricity in order to power the additional rail-based and other transportation systems needed to transport Germans who are no longer driving cars. Even if we assume the German government would phase in the abolition of the internal combustion engine over, say, 10-15 years following the 2030 deadline, there’s still the problem of producing 79% more electricity between now and 2040-2045.
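The arithmetic behind the bullets above is easy to check yourself. A minimal sketch in Python, using only the figures already cited (not independent estimates):

```python
# Back-of-envelope check of the per-car and fleet-wide energy figures.
# All inputs come directly from the bullets above.

MILES_PER_YEAR = 8_900        # average annual mileage per German car
MPG = 30                      # assumed fuel economy
KWH_PER_GALLON = 33.7         # energy content of one gallon of gasoline
CARS = 44.4e6                 # cars in Germany, start of 2015
GERMANY_TWH_2015 = 559.2      # total German generation in 2015, TWh

# Energy-equivalent consumption per car, in kWh per year
kwh_per_car = MILES_PER_YEAR / MPG * KWH_PER_GALLON   # ~9,998 kWh, i.e. ~10 MWh

# Fleet-wide requirement, converted from kWh to TWh per year
fleet_twh = kwh_per_car * CARS / 1e9                  # ~443.9 TWh

# Extra generation needed relative to 2015 output
extra = fleet_twh / GERMANY_TWH_2015                  # ~0.79, i.e. about 79%

print(f"{kwh_per_car:,.0f} kWh/car, {fleet_twh:.1f} TWh fleet-wide, +{extra:.0%}")
```

Running this reproduces the ~10 MWh per car, ~443.9 TWh fleet-wide, and ~79% figures quoted above.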

So, while converting to electric cars is a good idea in theory, in practice it is highly unlikely to happen in the timeframe mandated by the Bundesrat. In short, edicts from senior managers often fail because those managers never did the math or spoke to anyone in the trenches who would be responsible for trying to make it happen.

The point of this post is not to criticize the German government or the notion of reducing the consumption of fossil fuels, but instead to suggest that critical thinking is needed in all facets of life. When someone proposes a new idea, be skeptical until you’ve done the math and thought through the consequences and ramifications of the proposal. For example, when senior management suggests your company move the email system completely to the cloud, think through all of the potential ramifications of that decision. Are there regulatory obligations we will no longer be able to satisfy? How much will it cost to rewrite all of the legacy, email-generating applications on which we currently rely? What will happen to our bandwidth requirements? How will we deal with disaster recovery? How do we manage security? What is the complete cost of managing email in the cloud versus the way we do it now?
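"Doing the math" on a migration like that can start with a few lines of code. The sketch below is purely illustrative: every cost figure is a hypothetical placeholder, and you would substitute your own vendor quotes and internal cost data before drawing any conclusion.

```python
# A hedged sketch of comparing annual email costs, on-premises vs cloud.
# Every figure below is a hypothetical placeholder, not a benchmark.

users = 5_000

on_prem_annual = {
    "server_hardware_amortized": 60_000,
    "licenses": 120_000,
    "admin_staff": 200_000,
    "data_center_share": 40_000,
}

cloud_annual = {
    "per_user_subscription": users * 8 * 12,   # $8/user/month, assumed
    "bandwidth_upgrade": 30_000,               # fatter pipe to the cloud
    "legacy_app_rewrites_amortized": 50_000,   # email-generating applications
    "migration_amortized": 25_000,
}

def total(costs: dict) -> int:
    """Sum one scenario's annual cost components."""
    return sum(costs.values())

print(f"on-prem: ${total(on_prem_annual):,}/yr")
print(f"cloud:   ${total(cloud_annual):,}/yr")
```

The value of the exercise is not the totals themselves but forcing every hidden cost, such as rewriting legacy applications or upgrading bandwidth, to appear as an explicit line item.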

Senior managers or boards of directors will sometimes implement policy or make other important decisions without first consulting those who actually need to make it happen. This means that senior management teams, task forces, boards of directors, etc. need to a) stop doing that, b) do the math for any decision they’re considering and c) consult with the people who will be charged with implementing their decisions.

The Future of Computing is 40 Years Ago

The history of computing can be oversimplified as follows:

  • 1950s through the 1970s: Mainframes, in which massive computing and data storage resources were managed remotely in highly controlled data centers. Intelligence and data were highly centralized, accessed through dumb terminals.
  • 1980s through the 1990s: Client-server computing, in which intelligence and data moved to the endpoints of the network as CPU power and storage became dramatically less expensive.
  • 2000s: Cloud computing, in which much of the intelligence and data storage is moving back to highly controlled data centers, but with lots of intelligence and data still at the endpoints.

I believe the fourth major shift in computing will be to revert back to something approaching the mainframe model, in which the vast majority of computing power and data will reside in data centers that are under the tight control of cloud operators using both public and private cloud models.

Smartphones now have more computing power than most PCs did just a few years ago, albeit with much less storage capacity. While the smartphone does not provide corporate users with the form factor necessary to do writing, spreadsheets, presentations, etc. with the same ease that a desktop or laptop computer does, the combination of a smartphone’s CPU horsepower coupled with a monitor and keyboard that serve as a dumb terminal would provide the same experience as a desktop or laptop. As proposed by Robert X. Cringely a couple of years ago, I believe that the corporate PC of the future will be a completely dumb terminal with no Internet connection or local storage. Instead, it will have only a monitor and keyboard and will use the smartphone in the corporate user’s pocket as its CPU and connectivity.

Why? Three reasons:

  • It will be more secure. Data breaches are an unfortunate and increasingly common fact of life for virtually every organization. Many data breaches are the result of simple mistakes, such as laptops being stolen out of cars or left behind at TSA checkpoints, but many data breaches are the result of hacking into on-premises, corporate servers that are insufficiently protected. A review of the most serious data breaches reveals that the vast majority of data breaches have occurred from on-premises servers and other endpoints, not cloud providers. Yahoo!’s recent and massive data breach is more exception than rule, since cloud data centers are typically more secure than those on-premises behind a corporate firewall.
  • It will be cheaper. Instead of providing a laptop and/or desktop computer to individual users, companies will be able to provide a much less expensive dumb terminal to their users that will use a smartphone’s intelligence and computing horsepower to provide the laptop or desktop computing experience transparently. Users will be able to sit down at any dumb terminal, authenticate themselves, and enjoy a laptop or desktop experience. Because storage will be in the cloud, there will be no local storage of data, reducing cost and enhancing security. And, if the dumb terminal is stolen, a company is out only a few hundred dollars, not the millions of dollars for which it might be liable if data is breached from a stolen or otherwise compromised device.
  • It will be more controllable. Instead of users having access to two, three or more computing devices, users can be equipped with just one corporate device, a smartphone, that will enable all of their computing experiences. When the employee leaves the company or loses their device, disabling access to corporate data will be easier and more reliable.

In short, the future of computing will be conceptually similar to what our parents and grandparents experienced: computing intelligence and data storage in some remote, secure location accessed by dumb devices (other than our smartphone).

Best Practices for Dealing With Phishing and Ransomware

We have just published a white paper on phishing and ransomware that we welcome you to download and review. Here are some of the key takeaways from the paper:

  • Both phishing and crypto ransomware are increasing at the rate of several hundred percent per quarter, a trend that Osterman Research believes will continue for at least the next 18 to 24 months.
  • The vast majority of organizations have been victimized by phishing, ransomware and a variety of security-related attacks during the past 12 months. In fact, phishing and ransomware are among the four leading concerns expressed by security-focused decision makers as discovered by Osterman Research in the survey conducted for this white paper.
  • Security spending will increase significantly in 2017 as organizations realize they need to protect against phishing, ransomware and the growing variety of other threats they face.
  • Most organizations are not seeing improvements in the security solutions they have deployed or in the security practices they follow. While many of these solutions are effective, most are not improving over time, in many cases because internal staff lack the expertise to tune them. On balance, only two in five of these solutions and practices are considered “excellent”.
  • Security awareness training is a key area for improvement in protecting organizations against phishing and ransomware, since our research found that organizations with well-trained employees are less likely to be infected.
  • There are a variety of best practices that organizations should follow in order to minimize their potential for becoming victims of phishing and ransomware. Among these best practices are implementing security awareness training, deploying systems that can detect and eliminate phishing and ransomware attempts, searching for and remediating security vulnerabilities in corporate systems, maintaining good backups, and using good threat intelligence.

You can download the paper here.

As an aside, I will be attending the Virus Bulletin International Conference next week in Denver and encourage you to do likewise if you’re at all focused on security. I have been to this event before and can vouch for its tremendous value as a place to learn about trends in cyber security and to advance your education about all things security.

Phishing and Ransomware are the Logical Evolution of Cybercrime

Phishing, which can be considered the delivery mechanism for various types of malware and cybercrime attempts, and ransomware, a specialized form of malware designed for the sole purpose of extorting money from victims, are critical problems that every organization must address through a variety of means: user education, security solutions, vulnerability analysis, threat intelligence, good backup processes, and even common sense. The good news is that there is much that organizations can do to protect themselves, their data, their employees and their customers.

Phishing, particularly highly targeted forms of phishing like spearphishing and CEO Fraud/Business Email Compromise (BEC), as well as ransomware, are the logical evolution of cybercrime. Because there have been so many data breaches over the past few years that have resulted in the theft of hundreds of millions of records, there is a glut of this information on the market. The result, as there would be in any other business driven by the economics of supply and demand, is that prices for stolen records are dropping precipitously: a leading security firm estimates that the price of a stolen payment-card record has decreased from $25 in 2011 to just $6 in 2016.

Consequently, cybercriminals are turning increasingly to more direct means of theft. For example, ransomware will extort money directly from victims without requiring stolen data to be sold on the open market where it is subject to economic forces that can reduce its value. CEO Fraud/BEC can net hundreds of thousands or millions of dollars in a short period of time by getting victims to wire funds directly.

We are in the process of writing a white paper on phishing and ransomware, and will be publishing the results of an in-depth survey on these problems. Let us know if you have any questions or would like a copy of the white paper when it is published next week.

The Department of Labor’s New Overtime Rule

I originally posted this in September 2015, but have updated it now that the new Labor Department overtime rules are going to become effective in the near future.


In March 2014, the president directed the US Department of Labor to update key regulations for white-collar workers who are covered by the overtime and minimum wage standards under the Fair Labor Standards Act (FLSA). In July 2015, a Notice of Proposed Rulemaking was published in the Federal Register for the purpose of soliciting public comments on the rule. The 98-page (!) document is available for review here.

The result of the rule change will be to require employers to pay workers for after-hours activities that they are required to perform, such as checking email, being available to deal with company emergencies, or responding to a manager’s inquiries. Currently, employees who earn more than $23,660 per year (about $11.38 per hour) are exempt from these rules and can be required to work after-hours for no additional overtime pay. The new rule raises that exemption threshold, last updated in 2004, to $47,476 (about $22.83 per hour), and will add more than four million additional employees to those already covered.

Here is what I believe will be some of the implications of this new rule:

  • There will be a need to block employee access to a variety of corporate systems for employees whose salaries are below the Labor Department-imposed threshold. These systems include email, SharePoint, CRM systems, corporate social media, corporate instant messaging, VoIP, and any other communication or collaboration system that could possibly be used to respond to a manager’s inquiry, a customer request, a server alert, or that can be used for any type of work activity. One email server vendor, Alt-N, has already implemented a “Do Not Disturb” feature that will allow companies to turn off email during non-working hours so that they can be compliant with the new overtime rules.
  • The alternative, of course, is to simply pay employees for the additional time they work beyond 40 hours each week, but that creates problems that many organizations may not want to address, and it could add dramatically to labor costs. For example, if an employee checks email after work hours, will they be required to log their time spent doing so? Would this include informally checking email if they wake up in the middle of the night?
  • Access control will have to be appropriately linked between HR and IT so that employees who are below the Labor Department-mandated threshold will be prevented from accessing corporate systems during non-work hours. When an employee’s salary reaches the government-mandated level, however, then access can be turned on for these individuals.
  • There will be instances in which an employee whose salary is below the threshold will temporarily be required to work after-hours (such as an administrative assistant covering for his or her manager when he or she is out sick) and so access management capabilities will have to be in place to turn these capabilities on and off quickly to ensure that the employee can fulfill his or her job requirements. This will necessitate a tie-in to HR systems to guarantee that the employee is compensated appropriately for after-hours work.
  • Larger companies will have to maintain even tighter controls to prevent violations of the law for the same employee roles if compensation for these roles differs. For example, suppose a customer service representative in New York City makes $60,000 per year and so has the Labor Department’s permission to access email and other corporate systems after-hours without the need to be paid extra. Someone with the same job title in Wichita, Kansas might make $40,000 per year and so will not be allowed to do so without receiving overtime pay. What this means is that employees in more expensive labor markets will have freedoms that their counterparts in less expensive labor markets will not have. It also means that employees with more experience and who are paid a higher salary could have access to corporate systems while their less experienced and lower paid counterparts could not.
  • While some employers abuse their employees’ time and expect them to work after-hours for no additional pay or other compensation, there are employees who actually want to work after-hours. For example, some enterprising employees looking to impress their boss or their clients might want to catch up on email before going to bed simply to get a jump on the next day. Some might want to respond to a European or Asian customer’s inquiry in the middle of the night to satisfy that customer as quickly as possible. However, only employees whose salary is above the Labor Department’s threshold will be permitted to do these things on their own time.
  • IT will need to make special accommodations for traveling employees. For example, an employee based in California who travels to Virginia might want to check his or her email at 7:00am local time. However, because his or her email access is restricted until working hours begin in California, accessing email could be impossible until 11:00am local time (8:00am California time) unless the employee has pre-arranged with IT to implement a temporary rule change to accommodate his or her presence on the east coast.
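The HR/IT access linkage described in the bullets above can be sketched in a few lines. The $47,476 threshold comes from the new rule; the working-hours window, function names and salary figures below are hypothetical assumptions for illustration only:

```python
# A sketch of salary-threshold-aware access control for corporate systems.
# The threshold is from the new overtime rule; everything else is assumed.
from datetime import datetime, time

FLSA_EXEMPT_THRESHOLD = 47_476                   # annual salary, per the new rule
WORK_START, WORK_END = time(8, 0), time(17, 0)   # home-office hours, assumed

def may_access(salary: float, now: datetime, temporary_override: bool = False) -> bool:
    """Allow after-hours access only for exempt employees or approved overrides."""
    if salary >= FLSA_EXEMPT_THRESHOLD or temporary_override:
        return True
    # Non-exempt employees are limited to normal working hours
    return WORK_START <= now.time() <= WORK_END

# An assistant covering for a sick manager gets a logged, temporary override
late_evening = datetime(2016, 12, 1, 21, 30)
print(may_access(40_000, late_evening))                           # blocked
print(may_access(40_000, late_evening, temporary_override=True))  # allowed
```

In practice, the override path would also have to write a record back to the HR system so the after-hours time is compensated, and the time-of-day check would need to follow the employee's home time zone rather than the server's.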

In my opinion, employees should have the right to access corporate systems whenever they want to do so. And employees in Wichita should have the same options available to them as their counterparts in New York, as should less experienced/lower paid employees who work alongside their more experienced/better paid co-workers.

All of that said, it will be essential for employers to be able to turn email and other corporate systems on and off based on this ruling. Failing to do so could end up being very expensive.

The Danger of Juxtaposition and Social Media

Henry David Thoreau: “The question is not what you look at, but what you see.”

The law in the United States includes your reputation as a component of your personal property. Just like you’d protect your personal property from damage, so too should you protect your reputation from harm, even harm on social media.

One of the ways that your reputation can be damaged is through juxtaposition, defined as “the state of being close together or side by side.” For example, in the context of social media, juxtaposition can occur when someone sees offensive content in close proximity to your name, such as a comment on one of your posts on Facebook. If you’re a company and your employees post offensive content AND indicate that you’re their employer, that can harm your corporate reputation simply through association with the offender.

This was highlighted for me recently when a friend on Facebook posted some photos of someone burning an American flag and one of her friends responded, “Yes, beat the **** out of him.” Unfortunately for her employer, she noted in her profile that she’s an assistant vice president for a bank located here in the Northwest. In another example, the Facebook friend of a Facebook friend has posted a long string of very offensive and personally demeaning comments on his Facebook page, and he too took the time to prominently display his employer’s name on his Facebook profile.

Clearly, the employers in these examples did not authorize the juxtaposition of their corporate identity and the offensive content published by their employees. They very likely don’t hold the views that their employees express. Moreover, the vast majority of people will never consciously blame an employer for the offensive views of its employees. But the juxtaposition of an employer’s identity with content that would clearly offend a large proportion of its current or prospective customers is now there for all the world to see. Like it or not, some people will inadvertently associate that company with that content. Most will not do so deliberately, but when they see the company name again, they will remember the offense they took at what they saw.

As an employer, you really can’t control what employees post on their personal social media accounts. However, you can remind employees about the importance of appropriate decorum when using social media, even if it’s their own. You can ask employees not to post your company’s identity on their personal social media profiles. You can have a policy that prevents the use of personal social media using company-owned facilities. And, you can hire people who restrain themselves just a bit before posting to their personal social media accounts, because if they choose to be racially, sexually or politically offensive on their own time, you can bet that it’s probably going to spill over into their behavior as an employee at some point.

The Need to Manage Social Media Properly

Social media is pervasive in the workplace, used not only by employees for personal purposes, but also for business purposes. For example, 73% of the organizations surveyed for a white paper that we recently published employ Facebook for business reasons, 64% use LinkedIn, and 56% use Twitter, in addition to a variety of other social media platforms. Moreover, a large and growing proportion of organizations use enterprise social media platforms, such as Microsoft SharePoint, various Cisco social media tools, Microsoft Yammer, Salesforce Chatter and IBM Connections, among many others.

The use of social media provides a number of important benefits that help organizations to become more efficient, that help users speed the decision-making process, and that allow information sharing in a way that is not possible or practical otherwise. However, the use of social media – whether consumer-focused or enterprise-grade – comes with several risks and costs:

  • The increased likelihood that malicious content can enter an organization through a social media channel. Our research found that 18% of organizations have experienced malware infiltration through social media, although a substantially larger proportion simply don’t know how malware entered.
  • The greater likelihood of breaching sensitive or confidential data, either through inadvertent actions on the part of employees, such as unmanaged sharing of geolocation data, or malicious employee activities.
  • The inability to retain the relevant business records and other information that organizations are obligated to preserve. Our research found that 43% of organizations that have deployed an enterprise social platform do not archive information from it, yet 26% have had to produce content for eDiscovery from the platform.

To address these issues and mitigate the risks associated with the use of social media, every organization that permits social media use (as well as those that don’t officially permit it but don’t block it) should implement a variety of best practices:

  • Conduct an internal audit of social media use to determine which tools are being used, why they are in use, and the business value that the organization is deriving or potentially can derive from them. The analysis that flows from this audit should also consider the consequences of forbidding certain social media tools, if that’s warranted, including the impact a ban would have on customer relationships and employee morale.
  • Implement appropriate policies that will address employees’ acceptable use of social media tools, identify which roles in the organization should have rights to specific social media features and functions, and clearly spell out the rights of the organization to monitor, manage and archive social media use and content.
  • Ensure that employees are trained on corporate social media policies and that they are kept up-to-date on policy changes.
  • Deploy the appropriate technologies that will mitigate risks from malware and other threats delivered through social media and corporate social networks.
  • Deploy solutions that will archive business records and other content contained in social media and corporate social networks.
  • Implement an enterprise social media solution that will not only mitigate the risks associated with use of consumer-focused social media tools, but that will also provide enhanced communication, collaboration and information-sharing capabilities.

You can download our most recent white paper on enterprise social media here.

Some Thoughts on IBM Connect

This was my tenth IBM Lotusphere/ConnectED/Connect and, arguably, one of the best. A somewhat new focus, a new venue and a substantial number of people (2,400?) made for a very good event. The expo floor continues to shrink each year, but was still fairly busy most of the times I was there or passed by. Plus, holding the event in a new venue helps to minimize comparisons with past events that had 10,000 or more attendees.

IBM is pushing hard on its social message, integrating social collaboration across every aspect of its offerings: Notes, Domino, Verse, Connections, et al. Even more pronounced was the “cognitive” message – namely applying Watson technology to just about every aspect of the user experience, from identifying those emails that users need to address first to simplifying the calendar experience.

What was interesting is that the keynotes stressed capabilities – communicating more effectively, setting up meetings, and having better access to files – not product names. For example, while I would have expected Verse to take center stage as the hub of the user experience, the name “Verse” was surprisingly underemphasized (at least in the keynotes, although not so much in the breakout sessions). Apparently, according to the IBMers with whom I spoke about this, it was by design. IBM wants to emphasize what people can do, not the tools they use to do it. For example, the company emphasized its dashboard that is automatically populated for each user with content from Verse, Connections and other tools depending on how people work, but minimizes the identity of the specific platforms that host this information.

While I understand the capabilities-not-products approach, I’m not sure the market will agree. Microsoft’s success in the business communication space is attributable, in part, to the fact that it pushes hard on product identity: Exchange, Outlook, Office 365, Yammer and, more recently, Skype for Business. For example, there are many non-IT decision makers that tell IT they want “Outlook” as their corporate email system (when they really mean Exchange), not “the ability to manage email, calendars and tasks from a single thick or thin client interface”. I could be wrong and IBM’s research may indicate that people think in terms of capabilities and not products, but I don’t think so.

Moreover, when comparing Verse to Exchange Online or Gmail, Verse wins hands down in my opinion. The interface in Verse is cleaner, and the integration with Watson to apply analytics to email makes it the superior offering. Yet, many – even in the analyst community – have never heard of Verse. I don’t believe a strategy that deemphasizes the identity of this very good email platform is the right choice.

With regard to Verse, IBM is making headway, although the company’s policy is not to reveal numbers from its customer base. All of IBM’s several hundred thousand users have been migrated to Verse and there are some useful new features and functions coming down the road. For example, an offline capability will be available at the end of March that will allow access to five days of email and 30 days of calendar (a future version will permit users to adjust the amount of content available offline). Two hundred IBMers are already using offline Verse. Given that the offline version using HTML5 will suffice for the non-connected experience, there will not be a native Verse client anytime soon, if ever. An on-premises version of Verse will be coming later this year. There are other developments to be made available soon, such as the ability to use Gmail and Verse simultaneously in trial accounts, that I will write about when they’re ready.

With regard to other vendors at Connect, I was quite impressed with Trustsphere’s LinksWithin offering that enables analysis of relationships within email, as well as Riva International’s server-side CRM integration capabilities that allow CRM data from a variety of leading platforms to be accessed within Notes, Exchange and other email clients and Webmail.

What You Can Do With “Records”

Some questions about your taxes:

  • Do you file a tax return?
  • Do you make a copy of that tax return?
  • Do you put that copy into a filing cabinet or some other place where you’ll be able to find it quickly?
  • Do you pull that copy out of the filing cabinet and shred it after 30 days instead of keeping it for the next several years?

Hopefully, your answers are Yes, Yes, Yes and No. If that’s the case, you already get the concept of Archiving 1.0, because you a) determined what constitutes an important record, b) understand the importance of making a copy of it, c) know that you need to have it readily available in the future, and d) realize that you have to keep important records for a long time.

That’s where Archiving 1.0 pretty much ends: making copies of important stuff, putting it away for long periods, and being able to find it when needed. But what’s next — what many are calling Archiving 2.0? Consider those multiple years of tax returns for a moment. They include records of your earnings, deductions and other important information that you need to defend yourself in case you’re audited by your tax authority, apply for a loan, or otherwise need to prove how much you earn and deduct each year. But they also contain lots of other information — data on those you support, where you spend your money, how much you invest, your financial gains and losses, how your income changes each year, charities to which you donate, the amount you pay in property taxes, who you employ to do your taxes, changes in your family structure, and a great deal of other information that would allow someone to understand your decision-making, your success in business, the nature of your key relationships, etc.

Now, think about Archiving 2.0 in the context of your business. Let’s say that you archive just your corporate email. Doing so would preserve all of the business records sent and received through email that you might need to defend yourself, satisfying your Archiving 1.0 obligations. However, here’s what else the archive would contain:

  • Every customer inquiry delivered through email, who responded to it, the amount of time that it took to respond, the customer’s response in return, and whether or not the inquiry was resolved to the customer’s satisfaction.
  • Every prospect inquiry delivered through email and how it was satisfied (or not).
  • What managers tell employees in email.
  • What employees tell each other in email.
  • How employees deal with sensitive information.
  • Information about rumors that might be spreading in the company.
  • How employees are using corporate email after hours.
  • The recipients of every email and attachment sent through email, including information that was sent to competitors.
  • Information about employees who might be considering or committing fraud.
  • How people in your company interact with one another.
  • The actual management hierarchy in your company that may or may not coincide with your org chart.

This is just the tip of the iceberg in terms of what you might be able to do with this information, given the right archiving platform, the right analytics tools, and the ability to sell management on the idea that your information archives contain a wealth of untapped insight about your company that won’t be available anywhere else. Now add in other data types, such as social media posts, instant messages, voicemails, collaborative session discussions and files, and dramatically more information becomes available for investigations, analysis of customer interactions and employee behavior, helping employees find the expertise they need, and building better connections between your employees, business partners, customers, prospects and others.
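To make one of these ideas concrete, the "actual management hierarchy" point above could be approximated by mining sender/recipient pairs from archived message headers. This is a toy heuristic for illustration only; the function name and the sample data are hypothetical, and a real analytics platform would use far more sophisticated methods:

```python
from collections import Counter, defaultdict

def infer_hierarchy(messages):
    """Rank people by how many distinct colleagues write to them.
    Heavy inbound traffic from many senders often suggests someone
    sits higher in the de facto hierarchy. (Toy heuristic only.)"""
    edges = Counter()
    for sender, recipient in messages:
        edges[(sender, recipient)] += 1
    # Collect the distinct senders who write to each recipient
    inbound = defaultdict(set)
    for (sender, recipient) in edges:
        inbound[recipient].add(sender)
    return sorted(inbound, key=lambda p: len(inbound[p]), reverse=True)

# Hypothetical archive extract: (sender, recipient) pairs from headers
msgs = [("ana", "carol"), ("bob", "carol"), ("dave", "carol"),
        ("ana", "bob"), ("carol", "ana")]
print(infer_hierarchy(msgs))  # "carol" ranks first: three distinct senders
```

Even this crude count can reveal communication hubs that don't match the official org chart, which is the kind of untapped insight an Archiving 2.0 platform could surface.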

We’re about to produce a white paper on this topic, including an in-depth survey of where organizations are going with Archiving 2.0. We’ll report back on the key findings when they’re available.

Check out StealthChat

Most of the communications we send or receive can be accessed by unauthorized parties: email is typically sent in clear text, voice communications can be intercepted, and instant messages or Facebook Messenger posts are typically not secure. Plus, our communications can live forever on a server or on the recipients’ devices, increasing the potential for data leaks or other forms of unauthorized access.

Enter StealthChat, a free service provided by Rockliffe Systems. StealthChat provides a number of important capabilities, including instant/chat messaging, the ability to place VoIP calls, and the ability to share images, all with robust encryption to ensure that unauthorized parties cannot gain access to your content. All content is encrypted both on the device and in transit. Plus, senders can establish a “burn” time for each message, making it disappear a set amount of time after it has been read. Moreover, content sent via StealthChat never gets written to a server, but instead resides only on the senders’ and recipients’ devices.
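The "burn" timer described above can be thought of as a read-then-expire rule enforced on the device itself. The sketch below is a hypothetical illustration of that concept only, not Rockliffe's actual implementation, and it omits the encryption layer entirely:

```python
import time

class EphemeralMessage:
    """Toy model of a 'burn after reading' message: once first read,
    it becomes unreadable after burn_seconds have elapsed.
    (Illustrative only; not StealthChat's real protocol.)"""

    def __init__(self, body, burn_seconds):
        self.body = body
        self.burn_seconds = burn_seconds
        self.read_at = None  # burn timer hasn't started yet

    def read(self):
        now = time.monotonic()
        if self.read_at is None:
            self.read_at = now            # first read starts the burn timer
        if now - self.read_at > self.burn_seconds:
            self.body = None              # burn: content is gone for good
            raise ValueError("message has burned")
        return self.body

msg = EphemeralMessage("meet at 4pm", burn_seconds=30)
print(msg.read())  # prints "meet at 4pm"; 30 seconds later it burns
```

Because the timer and deletion run on the recipient's device and the content is never written to a server, there is simply nothing left to leak once the message burns.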

StealthChat competes with a number of offerings, including WhatsApp, Skype, Facebook Messenger, SnapChat and others. In fact, Rockliffe unofficially calls StealthChat “SnapChat for Professionals”. What sets StealthChat apart is that, unlike some of its competitors, it provides encryption at rest, offers VoIP calls that are as secure as its chat messages, and stores nothing on Rockliffe’s or anyone else’s servers.

While much has been made of these types of encrypted, ephemeral communications being used for illicit activities, including terrorist operations, they have tremendous value for legitimate purposes. For example, traders sending information to one another, healthcare professionals sharing patient information, and business people sending confidential information can all make use of StealthChat. Of course, any information that should be archived for long periods needs to be sent via more traditional communications channels and should not be sent via StealthChat for obvious reasons, but much of our communication doesn’t fall into this category and can make good use of the security that StealthChat provides.