The Front-Office Transformation of the Modern CIO

Traditional thinking often relegates technology employees – even executives like the CIO and CTO – to back-office positions, supporting the day-to-day operations of a company without directly interfacing with customers or impacting the bottom line. However, as customers move their business interactions to an increasingly online and social media-driven world, CIOs find that the customer experience is increasingly within their purview. The savvy CIO will find ways to focus resources on improving the customer experience, which can make a big difference across the entire sales funnel.

Reputation Economies

Reputation economies are growing fast in today’s digital and interconnected world, with word of mouth having a direct and proven impact on sales. With low online barriers for both complaints and praise, customers are likely to take to Twitter, Facebook, and other social media platforms to call out outstanding customer experiences – and disappointing ones. In fact, poor word of mouth alone is estimated to cost US companies $41 billion each year, and that number doesn’t count the cost of customers who have a bad experience and simply turn away, never complete a purchase, or never return.

More and more of a customer’s experience with a company is filtered through the company’s online and technological presence. Between 2015 and 2016, for the first time in history, consumers became more likely to make purchases online than in a brick-and-mortar store, and everything from fresh groceries to B2B enterprise data solutions is moving onto the internet. This means that a CIO is positioned to control one of the most important front-office functions within a company.

Managing the Customer Experience

Content management systems (CMS) and customer relationship management (CRM) systems are two of the basic platforms for managing the customer experience. The CRM system monitors and analyzes customer response through a variety of points of contact, from website hits to call center interactions, delivering actionable data to CIO teams. Meanwhile, the CMS tailors a customer’s experience to their needs and preferences.

For example, the CRM for a design software company might track statistics on website hits, and discover that marketing and product information pages are overwhelmingly visited by designers with high-resolution, multi-monitor setups – whereas later in the sales funnel, orders are placed by executives on mobile devices such as tablets, or more middle-of-the-road desktops with monitors. Understanding this breakdown allows a CIO to target each visitor persona with a website experience modeled for their unique environment, leading to positive responses from the customer.
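To make the idea concrete, here is a minimal sketch in Python of how such persona targeting might look. The field names (device, screen_width, monitors) and layout names are hypothetical, not drawn from any particular CRM or CMS product.

    def classify_visitor(visit: dict) -> str:
        """Bucket a visit into a rough persona based on device traits."""
        if visit["device"] == "mobile":
            return "executive-mobile"
        if visit["screen_width"] >= 2560 or visit["monitors"] > 1:
            return "designer-workstation"
        return "general-desktop"

    def pick_layout(persona: str) -> str:
        """Map each persona to a page variant served by the CMS."""
        return {
            "designer-workstation": "rich-media-layout",
            "executive-mobile": "one-tap-checkout-layout",
            "general-desktop": "standard-layout",
        }[persona]

    visit = {"device": "desktop", "screen_width": 3840, "monitors": 2}
    print(pick_layout(classify_visitor(visit)))  # rich-media-layout

In practice the thresholds would come from the CRM’s own analytics rather than being hard-coded, but the division of labor is the same: the CRM supplies the segmentation data, and the CMS serves the matching experience.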

Bringing the CIO and technology staff into the front office may require a hard look at the scope of the position, as well as re-training, clarifying business goals, and auditing existing processes. But the results can bring the power of digital transformation to the company’s growth and sales.

Security Budgets Continue to Soar, But Is It Enough?

Security is now a vital concern for businesses across industries, yet many of the investments in privacy and defense being made today should have been implemented years ago. With cyber crime now an international epidemic, why have so many companies waited so long to invest in cybersecurity measures? The following sheds some light on whether it’s too late to invest in cybersecurity.

Cybersecurity Is an Increasing Concern

Cybersecurity is a growing concern for many businesses, and the number of high-profile breaches continues to grow each year. In 2015, there were approximately 781 data breaches across the U.S. – the second-highest annual total on record. According to industry monitors, 40% of those breaches occurred in the business sector.

With this in mind, industry experts have predicted mass-scale investments in cybersecurity for 2017. Here are a few statistics based on Business Insider magazine and other industry publications:

  • An estimated $655 billion will be invested in cybersecurity measures between 2015 and 2020.
  • Nearly $2.77 trillion in security investments was estimated for 2016 – far above the $75.4 billion in spending that took place in 2015.
  • These numbers suggest that businesses are just now catching on to the importance of cybersecurity.

Are Recent Security Investments Enough?

Are these recent security investments enough to combat the rising number of intrusions? According to Radware, companies that are only now investing in cybersecurity protocols are far behind. New threats are evolving so rapidly that even the latest security applications and programs cannot counter new strains of malware, adware, and other malicious code.

Companies cannot afford to sit around and wait for the next best cybersecurity solution. Industry experts recommend the following:

  • Never procrastinate when it comes to protecting enterprise hardware, software, applications, and general infrastructure.
  • Work with leading vendors to develop a sound and proactive security platform that can combat prior and new threats.
  • Strong security platforms are based on solid foundations; core policies and processes for data availability, integrity, access, and confidentiality must be in place.

 

The Rising Costs of Security

IBM recently estimated that the average cost of a security breach in 2016 was $4 million, up from $3.8 million in 2015 – and that figure is slated to grow even more in 2017. With this in mind, businesses have to stop scrambling with last-minute endeavors to protect corporate data. They need to agree on one comprehensive, cohesive security platform that will prevent massive revenue losses.

The longer businesses wait to implement cybersecurity initiatives, the more susceptible they will be to digital intrusions. It will also be harder for them to incorporate security measures in the future, especially if infrastructure has already been jeopardized.

How Learning Sales Can Help IT Teams

Disruption from emerging trends keeps the IT industry constantly alert – or at least it should in certain cases. Some of the buzz terms that define new developments in technology include the Internet of Everything, digital transformation, and microservices.

Meanwhile, the cloud, containers, and the Internet of Things (IoT) appear to be established norms that aren’t going away, and many companies aspire to integrate downloadable applications with their services. Here are the reasons computer consultants need to balance their focus on these developments with sales.

Innovations vs. Distractions

One of the biggest challenges in the tech support industry is sorting meaningful innovations from marketing distractions. Is it necessary to devote time to every trend, such as the consumerization of IT? It depends on the goals, resources, and clientele of each provider, since there are multiple ways to resolve any specific problem.

Making the field more complicated is the niche branding of “as a Service” concepts that have been inspired by the SaaS boom. The question becomes: how much time should firms that market themselves as tech experts spend on learning trends that may have little effect on their markets?

The answer needs to stay close to the organization’s budget and the needs of existing clients. If a technology provider invests too heavily in new technology, it can drain the budget or lock the provider into serving clients it cannot serve efficiently. Too much focus on managing multiple data streams can lead to diminishing returns, which is why it helps to specialize in certain areas while still offering broad packages.

Many times new technology is redundant and merely introduces new semantics to the industry. AWS EC2 instances, for example, essentially equate to VMware vSphere virtual machines (VMs). Even for the most experienced tech talent, this proliferation of variations can create confusion while draining resources on learning subtle differences between services. One of the best ways for tech professionals to filter through this clutter is to learn sales.
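As a rough illustration of how close the two concepts really are, the following sketch uses boto3 (the official AWS SDK for Python) to launch an EC2 instance – conceptually the same act as creating a VM in vSphere, just expressed in AWS vocabulary. The AMI ID is a placeholder, and credentials are assumed to be configured in the environment.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # "Run an instance" in AWS terms; "create a VM" in vSphere terms.
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder image ID
        InstanceType="t2.micro",          # roughly: a small VM size
        MinCount=1,
        MaxCount=1,
    )
    print(response["Instances"][0]["InstanceId"])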


What Tech Pros Should Know About Sales

Although IT and sales are often considered separate professions, learning sales helps tech experts develop valuable skills that can enhance their careers by influencing colleagues and technology choices within the companies they work for. Understanding the sales process gives tech professionals an edge in problem solving when they deal directly with customer needs, and it helps them communicate clearly and see through marketing hype rather than thinking solely in technical jargon.

Here are basic sales steps that can help tech pros advance their careers by making better decisions for customers:

  1. Set the stage for expectations and resource needs by focusing on solving a problem instead of promoting features.
  2. Master solutions by knowing the differences in when and where to apply them.
  3. Improve consistency and control by applying the solution to a process.
  4. Deliver persuasive presentations that point toward clear and logical decisions.
  5. Move the pitch forward by focusing on the end result.
  6. Emphasize value while weighing costs attributed to time and labor.
  7. Be conscious of time and attitude factors that influence mindset.

Even though there’s an industry stereotype that tech support and sales don’t mix, it’s advantageous for tech support teams to develop sales skills, which contribute to customer satisfaction as well as career growth. The more skills they bring to their organizations, the better their opportunities will be.

Securing the Right Levels of Encryption

In a business environment where workplace collaboration is now considered the norm, how are consumer-focused companies implementing end-to-end security? According to industry experts, many commercial entities are simply emulating the security infrastructures of companies like Apple and WhatsApp.

To combat unsolicited messaging and outside intrusion, Apple revamped its security infrastructure to protect its iPhone users and their data. Similarly, WhatsApp reworked its messaging technology so that no one can access messages except the end users themselves. These changes have served as models for businesses wishing to incorporate stronger levels of encryption into their communications technologies.
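To show what end-to-end encryption means in practice, here is a minimal sketch using PyNaCl (Python bindings for libsodium). It illustrates the concept only; it is not the actual protocol Apple or WhatsApp uses.

    from nacl.public import PrivateKey, Box

    # Each endpoint generates a key pair; private keys never leave the device.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts with her private key and Bob's public key.
    sending_box = Box(alice_key, bob_key.public_key)
    ciphertext = sending_box.encrypt(b"meeting moved to 3pm")

    # Only Bob can decrypt; any server relaying the message sees ciphertext.
    receiving_box = Box(bob_key, alice_key.public_key)
    print(receiving_box.decrypt(ciphertext))  # b'meeting moved to 3pm'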

Issues with Encryption

While encryption is now commonplace in collaboration tools, it is still not easy to implement for companies with cloud-based messaging and communications. This is due to the following obstacles:

  • Cloud technologies are consistently changing and evolving, resulting in newer encryption modules that must be adopted and implemented by subscribers.
  • Cloud-based services are now adding more features, including bots, artificial intelligence, and even third-party integration.
  • The above-mentioned features are marketed as “value additions.” In practice, however, they mean that third-party vendors still have full access to user data and content.

To tackle this form of “accepted intrusion,” companies in the cloud are looking for stronger forms of encryption – schemes that protect user data and transmissions even from recognized vendors and service providers. In an industry blanketed with so many forms of encryption, is it possible to strike the right balance between content access and privacy?

Encryption Solutions in a Nutshell

There is no concrete answer to the current encryption dilemma. However, IT experts still play a pivotal role in implementing encryption and establishing access, eligibility, and defense policies for messaging programs. Companies cannot go to either extreme with encryption: not too insecure, but also not so locked down that collaboration stops. They must work together to find common ground and acceptable levels of encryption for all parties involved.

To that end, businesses can start with fully locked-down, end-to-end encrypted consumer messaging tools, taking advantage of proven encryption without investing in additional paid messaging apps.

Enterprise Messaging Providers

While WhatsApp seems to be a plausible solution, it is not the only program in town. Enterprise messaging providers also offer end-to-end encryption for their messaging platforms. However, services like Slack and HipChat are designed to be less strict about recognized intrusion, such as IT involvement during periods of downtime and maintenance. Certain clients may also have access to these internal chat databases, which can seriously impact privacy. With this in mind, user content and data can still be breached, and hackers may be able to intrude as well.

Green Tech—The Future of the Data Center

In the past few years, there has been an incredible surge in data center construction around the world. Companies like Microsoft, Facebook, and Amazon are spending huge amounts of capital to build them in places like Singapore, Taiwan, and Tokyo. The reason for this unprecedented growth is the expanding global need for both business and personal connections.

However, the amount of energy used to operate data centers is extreme. According to the U.S. Department of Energy, data centers are “the energy hogs of the computing world,” and a study released in June 2016 found that “US data centers consumed about 70 billion kilowatt-hours of electricity in 2014… representing 2 percent of the country’s total energy consumption… equivalent to the amount consumed by about 6.4 million average American homes that year.”
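The comparison is easy to sanity-check with the study’s own numbers, which work out to roughly 10,900 kWh per home per year, in line with published averages for US household electricity use:

    data_center_kwh = 70e9    # 70 billion kWh consumed in 2014, per the study
    equivalent_homes = 6.4e6  # 6.4 million average American homes

    print(f"{data_center_kwh / equivalent_homes:,.0f} kWh per home per year")
    # 10,938 kWh per home per year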

This type of energy consumption places huge drains on global infrastructures. Therefore, a push to develop energy-efficient data centers is at the forefront of IT concerns.

The Definition of a Green Data Center

Green data centers are those that are designed for maximum performance and efficiency, using a minimal amount of resources. Basically, that means that all of the hardware (the mechanical, electrical, and computing equipment) is arranged and operated in a way that reduces the environmental impact of the data center. There are a number of energy-saving strategies used to reduce consumption in data centers, including:

  • Low emission building materials
  • Water reuse and recycling systems (much water is required for cooling purposes in these industrial-scale facilities)
  • Alternative energy technologies (new cooling systems, photovoltaics, and innovative heating units)

Reducing energy consumption at the data center does more than help our environment; it offers OPEX reductions for the owners.

Current Data Center Condition

Over the last decade, there has been an incredible surge in the need for industrial facilities housing large amounts of server and other hardware equipment. Designed specifically for the needs of electronics, these structures require massive amounts of environmental and security controls, and their proximity to users determines latency. As a result, the abundance of affordable smart devices and increasing ranges of connectivity, combined with a plethora of new “as-a-service” offerings, has generated high demand for more data centers around the world.

The fact that cloud connectivity presents a number of cost-saving and performance-improvement strategies for enterprises has also contributed to data center expansion, as has the growing number of providers that are “born in the cloud.” According to Gartner, IT is projected to shell out nearly $1 trillion over the next five years transitioning to cloud computing services. That type of infrastructure will depend on more data centers for support.

Green Futures

Data center development has increased, and with it the energy required for operation. The good news is that the global commitment to developing more green facilities is strong. By investing in conservation and reuse equipment, providers will be able to pass the savings on to the end user. In addition, although the initial capital expenditure is higher than for traditional construction, a green data center delivers measurable ROI and long-term reductions in operating costs.

Understanding VoIP Issues and How to Solve Them

Although Voice over Internet Protocol (VoIP) offers superior quality and service compared to legacy private branch exchange (PBX) systems, situations can still occur that frustrate businesses and customers. Having reliable, clear call service is necessary to maintain a professional image. Dropped or choppy communications generate a bad impression and have the power to reduce revenues.

Fortunately, by knowing the reasons for poor VoIP service, companies can solve those problems swiftly. Following are the most common causes of call quality difficulties, and solutions for eliminating them.

#1. Internet Service Provider (ISP)

Often, dropped calls and persistent sound-quality issues are related to the business’s ISP. Many SMBs make the switch to VoIP in order to reduce costs, but fail to calculate the exact impact it will have on their total bandwidth consumption. Other issues involve connection speed or hardware; companies still using cable connections rather than fiber-optic service can suffer. Ookla offers a free speed test that can be used to determine current capacity.
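A rough capacity estimate is straightforward. The sketch below assumes the commonly cited figure of about 87 kbps per call for the G.711 codec once IP, UDP, and RTP overhead is included; the call volume and headroom factor are illustrative.

    KBPS_PER_G711_CALL = 87  # approx. per-call bandwidth incl. packet overhead

    def required_voip_kbps(concurrent_calls: int, headroom: float = 1.2) -> float:
        """Bandwidth needed for voice alone, with 20% headroom by default."""
        return concurrent_calls * KBPS_PER_G711_CALL * headroom

    # An office expecting 25 simultaneous calls:
    print(f"{required_voip_kbps(25) / 1000:.1f} Mbps")  # 2.6 Mbps

Comparing that figure against the measured speed from a test like Ookla’s, minus everything else the office does on the same link, quickly shows whether the connection is the culprit.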

Another ISP problem results from having two different providers deliver VoIP and network connectivity. Call issues can usually be traced back to packet priority: voice transmissions end up vying for precedence with every other type of data transmission. So if someone in the office starts a large download, call quality suffers.

Solution: Switching to a comprehensive provider that offers hosted phone service as part of a unified business communication package gives companies effective packet routing.

#2. Call Interference

Crackly sounds, buzzing, fading in and out, and other disruptions make it difficult for people to communicate. This issue is generally referred to as “jitter”: variation in the delay of received voice packets. Although the packets are transmitted in the correct order, evenly spaced, and in a continual stream, they arrive unevenly spaced and sometimes out of order. Causes of jitter include network congestion, unsuitable routing, and faulty configuration.
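Jitter is simple to quantify: capture the arrival times of packets that were sent at a fixed interval and measure how much the gaps between them vary. The sketch below uses mean absolute deviation on made-up timestamps (RFC 3550 defines a smoothed variant used by RTP itself):

    # RTP-style packets sent every 20 ms; arrival times are illustrative.
    arrivals_ms = [0.0, 20.1, 40.0, 61.8, 79.9, 100.3]

    gaps = [b - a for a, b in zip(arrivals_ms, arrivals_ms[1:])]
    mean_gap = sum(gaps) / len(gaps)
    jitter = sum(abs(g - mean_gap) for g in gaps) / len(gaps)
    print(f"mean gap {mean_gap:.1f} ms, jitter {jitter:.2f} ms")
    # mean gap 20.1 ms, jitter 0.85 ms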

Solution: Moving to a single provider can resolve these problems; or, companies can increase their bandwidth, place calls above all other traffic (voice receives priority), or overcome the issues by resolving hardware incompatibilities.

#3. Echoes and Delays

When a call sounds like it has been placed inside a cave, the echo heard is the result of latency. Voice transmission delays longer than about 50 ms can be discerned by users and make communications extremely frustrating. Propagation delay is part of the problem, but latency is also a result of improper prioritization.

Solution: Purchasing new hardware, arranging for policy-based network management, and instituting packet prioritization can be accomplished either in-house or by contracting with a service provider.
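On the application side, packet prioritization usually comes down to marking voice traffic so QoS-aware network gear can put it first. A minimal sketch, assuming Linux, where the DSCP value rides in the upper six bits of the IP TOS byte; routers along the path must be configured to honor the mark:

    import socket

    DSCP_EF = 46  # "Expedited Forwarding", the standard class for voice

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF << 2)

    # Voice payloads sent through this socket now carry the EF mark.
    sock.sendto(b"\x80" * 172, ("192.0.2.10", 5004))  # placeholder packet/address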

#4. Dropped Calls and Inconsistent Quality

Companies that suffer from fluctuating VoIP quality and frequent dropped calls present an unprofessional appearance. Although quality problems can be addressed using increased bandwidth, sometimes the issues are a result of inadequate switches, routers, or service.

Solution: Choosing a provider that offers active monitoring and troubleshooting is a good start. However, companies can also check equipment configuration and look at options for simplifying their networks.

As more and more businesses move from legacy PBX to VoIP, the need for superior service becomes clear. Contracting with a single provider offers network performance solutions that solve many call problems for the modern enterprise.

How Modern Technology Has Affected U.S. Politics

While this election has many people concerned and eagerly anticipating its end, there are certain notable technological advances that have had a positive effect on politics in the U.S. Here are some of the ways in which telecom technology has improved the political process.

Today’s Advances Allow Anyone to Stay Informed
Back in the early 1900s, the only way to make long-distance contact with anyone was via an expensive phone call. If people wanted to learn about what was going on across the nation, they had to be wealthy. Thanks to the availability and affordability of the internet and telecom a century later, people can get immediate access to all of the latest news — political or otherwise.

ARPANET Wasn’t the True Origin of the Internet
Contrary to what many believe, the invention of the Advanced Research Projects Agency Network (ARPANET) in the 1960s wasn’t the actual precursor to the modern-day internet. It was actually the invention of the transatlantic TV cable that initiated the development of what we would later come to know as the internet.

Fiber Optics Paved the Way for Telecom
Apart from the innovations taking place throughout the early-to-mid 20th century, another major influence on today’s internet and telecommunications was the development of fiber optics. Many people associate fiber optics with colorful displays and gifts, but these cables can also carry enormous amounts of information, which makes them invaluable to today’s technological advances.

Prior to fiber optic cables, trunk cables were commonplace, and took up a lot more physical space than today’s fiber. Today, hair-thin fiber optic cables can carry the same amount of information as about 300 trunk cable bundles, each with roughly the diameter of a baseball.

Online Voting May Not Be Far Away
The prospect of voting online sounds like an instant security risk, prompting many to mildly panic at the thought of a more easily rigged election thanks to hackers and other threats. On the other hand, shopping online also made many people skeptical when it rose to popularity in the late ’90s and 2000s (when people were stockpiling supplies to prepare for Y2K), but most people today won’t give a one-click Amazon buy a second thought.

Many states already offer online voter registration, and with this development online voting isn’t that much of a stretch. Today’s security capabilities may not yet be good enough to handle online voting effectively, but that future isn’t too distant. In the meantime, today’s technology has already made it easier than ever for voters to learn about political nominees, including their stances on issues, policies, and even their life stories.

There’s no question that technological developments over the years have revolutionized politics in the U.S., and these changes will only continue as telecom and internet capabilities increase.

How to Determine if Cloud or On-site Video Conferencing Is Ideal

Prior to purchasing video conferencing services, businesses should consider the differences between cloud and on-premises services to determine which type of service is ideal for them. As video conferencing continues to evolve and become increasingly streamlined over time, costs have decreased and these services have become more accessible to the average user.

There are two main ways for businesses to utilize video conferencing. The first is to keep operations on-site, with everything located within the workspace; the second is to outsource to a third-party provider that deploys it as Software as a Service (SaaS) via the cloud.

What Is Included in Video Conferencing Solutions?

Both cloud-based and on-site video conferencing services include many of the same components, but there are some differences between their capabilities.

  • Recording and Streaming – Both on-premises and cloud-based video conference calls can record and stream conferences, but while on-site services keep data within their own network, cloud services store data on third-party servers. Service level agreements (SLAs) can help businesses determine how data is stored and in what locations. Business owners also need to consider the total cost of ownership as it pertains to data storage and network traffic.
  • Multipoint Conferencing/Bridging – The great advantage of multipoint conferencing, or bridging, is the ability to include a large number of sites on a single call. Capacity varies from vendor to vendor, but typical on-site multipoint conferencing allows anywhere from 12 to 120 users on one call, while cloud services often allow 25 to 50 users per call with few limits on the number of calls.
  • Firewall Configuration – One of the limitations of on-premises video conferencing is the need to set up firewalls before making calls beyond the business’s network. Configuring a firewall can be complex, requiring users to take multiple factors into account. Cloud-based services can help companies avoid this issue entirely, allowing them to connect nearly anywhere with internet connectivity.
  • Centralized Network Management – Another advantage of cloud-based services is streamlined network management, with the ability to update software and address books and manage problems with the network through an automated process.

When On-Site Video Conferencing Is Appropriate

On-premises video conferencing was a much more popular choice for organizations when it was the only method available. Today, companies utilizing this type of conferencing the most are large organizations that place security as a top priority, such as government agencies or businesses that handle large amounts of confidential client information.

However, on-site video conferencing also requires a lot of resources that many businesses simply don’t have, particularly smaller companies with minimally staffed IT departments. Unless the business has the ability to handle all of the details surrounding on-premises conferencing, this method isn’t ideal.

When to Choose Cloud-Based Video Conferencing

Cloud technology is constantly evolving, and cloud-based services are becoming less expensive over time. They are highly scalable to meet the needs of small to large businesses, with fixed costs available to maximize predictability of expenses. These services also include plenty of automation to reduce the need for a large and consistently attentive IT team.

When choosing between on-site and cloud-based video conferencing services, businesses should consider their unique requirements.

The Importance of Wireless WAN Connections for Businesses

Wireless internet connectivity continues to be an important asset for businesses, including for wide-area network (WAN) connections. According to Enterprise Management Associates (EMA), 96% of distributed companies rely on wireless for WAN connections in certain remote locations.

Even if businesses aren’t necessarily using wireless for all of their remote sites, the fact remains that companies depend on this technology for parts of their operations. Continuously developing wireless technology keeps it relevant in today’s IT industry, with public Wi-Fi and LTE/4G among the most commonly used wireless technologies for a variety of applications.

Benefits of Using Wireless Technology for Enterprises

Wireless technology has long played a role in the WAN, but in the past it served mostly as a backup connection – which is how many businesses still use it today. In the event of an outage on their normal wired WAN link, companies can fail over to wireless 3G or 4G radio instead, helping avoid downtime.

Which Industries Use Wireless Technology the Most?

Some of the industries that rely on wireless technology for WAN connections include banking, insurance, and manufacturing. According to the same EMA study, banking and insurance enterprises use 3G and satellite connections more than others. A 3G connection would make sense in these environments because of the low cost for connecting many single-transaction devices at one time, such as ATMs.

At the same time, manufacturers experiencing dramatic growth may need to connect an increasing number of remote sites in a timely fashion, and 4G/LTE connections offer an effective temporary link until wired WAN connections can be installed.

Increased Security with Wireless

Wireless technology isn’t only crucial for temporary connectivity. EMA also found that many companies turn to wireless for improved security, better performance, and expanded bandwidth. Only 18% of businesses used it simply because of a lack of a wired connection.

LTE/4G also allows for better security, with strong encryption applied across multiple layers, making it a practical form of wireless to deploy around the world.

EMA predicts that wireless technology will only continue to grow in popularity, and businesses will be able to see the benefits when turning to this technology to support a WAN connection.

Are Wi-Fi Speeds at Their Peak?

Since its inception, Wi-Fi has become an integral part of daily life. If people experience issues when trying to establish a connection, many immediately place the blame on Wi-Fi – but the fact is that this technology is quickly becoming more capable than older connection methods.

While many may be impressed by the speed of today’s Wi-Fi, the question remains whether Wi-Fi speeds are truly reaching their full potential. Several factors affect Wi-Fi speeds.

Failure to Reach Theoretical Throughput
Since the introduction of Wi-Fi to the public along with the IEEE’s 802.11 standards, maximum data rates for the technology have surpassed traditional home internet connections. According to Akamai’s State of the Internet report, the average connection in 2016 reached 15.3 Mbps – still nowhere near the theoretical maximum.

This isn’t a new realization, however. Hardware almost never approaches theoretical maximum throughput because of several factors, including interference, multiple clients, and many other factors that affect the data transfer rate. Speed is also largely different in Wi-Fi connections compared to traditional wired connections.

Unlike wired connections, wireless connection speeds don’t just involve the rate of data transfer between handsets and access points. Wi-Fi speeds also account for availability and duty cycles: higher data rates mean each transmission occupies the airwaves for less time, leaving frequencies less heavily used and allowing more devices to connect. On the other hand, faster rates also compound the problem of density.

Solving the Problem of Density
According to York College IT director Joel Coehoorn, there are three main issues that render those theoretical maximums unachievable:

  • Maximum speeds pertain to products with unrealistic configurations that no user would actually set.
  • Many people don’t understand that access points must be able to support users operating at the lowest data rate on the network.
  • The more people using the connection, the more the bandwidth has to be divided among them, lowering each individual’s speed with every added user (see the sketch after this list).
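A toy illustration of that last point, using a nominal link rate divided evenly among active clients (real Wi-Fi sharing is less forgiving, since slow clients consume a disproportionate share of airtime):

    nominal_mbps = 300  # e.g., a common 802.11n access point's headline rate

    for clients in (1, 5, 10, 25):
        print(f"{clients:>2} clients -> ~{nominal_mbps / clients:.0f} Mbps each")
    #  1 clients -> ~300 Mbps each
    #  5 clients -> ~60 Mbps each, and so on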

These limitations make it difficult for wireless connections to meet the maximums, even if the technology is significantly better than other, older types of connections.

Technology Continues to Evolve
Wireless internet connectivity has made plenty of progress over the years, despite a few setbacks. Clients might see higher theoretical maximums (such as with Wave 2 standards that can achieve throughput of up to 3.47 Gbps), but this only means that the connection on the user end will be faster than it used to be.