How to Use a Knowledge Management System to Improve Customer Service
https://www.datamation.com/trends/use-knowledge-management-to-improve-customer-service/ (May 30, 2023)

A knowledge management system (KMS) can be defined as any system that identifies, organizes, stores, and disseminates information within an organization to make it easily accessible and usable. Whether a single, purpose-designed tool or a collection of integrated systems, a knowledge management system can provide value to an organization in a wide variety of ways.

One common business use is to improve customer service. In this context, a knowledge management system makes it easy to provide relevant and personalized information to customers and the staff who support them. This article looks at specific ways a business can use knowledge management systems to improve its customer service.

Eliminate Silos by Sharing Knowledge

A knowledge management system can help a business break down information silos that prevent different parts of the organization from having access to relevant information or being able to see more holistic views of customers and their interactions.

For example, information in the customer database may not be available to the analytics system, or management may collect sales data that never reaches the front-line workers who spend their days contacting customers.

A knowledge management system implemented in a call center or customer service setting can eliminate these information silos using the following best practices:

  • Consolidate knowledge repositories. Implementing systems that make it possible to unify knowledge repositories and databases will help keep all relevant information in a single system accessible by all.
  • Adopt federated search. Consolidating data and providing federated search tools make it possible for front-line staff to search all data sources based on one query.
  • Design systems from the point of service backwards. A customer-first approach will help ensure all customer data is available at each stage of their interaction with the company.

The easier it is for staff to find customer information, the easier it will be for them to provide high quality call responses and overall customer service.
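To make the federated search idea above more concrete, here is a minimal sketch in Python of fanning one query out to several repositories and merging the results. The repositories, field names, and search functions are hypothetical placeholders rather than any particular product's API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class SearchResult:
    source: str       # which repository the hit came from
    record_id: str
    summary: str

# Hypothetical adapters: each wraps one knowledge repository (CRM, ticketing,
# knowledge base, etc.) behind the same search interface.
def search_crm(query: str) -> list[SearchResult]:
    # In a real system this would call the CRM's search API.
    return [SearchResult("crm", "C-1001", f"CRM contact matching '{query}'")]

def search_tickets(query: str) -> list[SearchResult]:
    return [SearchResult("tickets", "T-2002", f"Open ticket mentioning '{query}'")]

def search_kb(query: str) -> list[SearchResult]:
    return [SearchResult("kb", "KB-42", f"Knowledge base article about '{query}'")]

SOURCES: list[Callable[[str], list[SearchResult]]] = [search_crm, search_tickets, search_kb]

def federated_search(query: str) -> list[SearchResult]:
    """Fan one query out to every repository and merge the results."""
    results: list[SearchResult] = []
    for source in SOURCES:
        results.extend(source(query))
    return results

if __name__ == "__main__":
    for hit in federated_search("billing dispute"):
        print(f"[{hit.source}] {hit.record_id}: {hit.summary}")
```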

Provide Consistent Information Across Channels

Call centers can no longer rely on a phone line for customer service. In this multi-channel world, customers looking for support expect online knowledge bases, social media access, chat tools, and more. This can pose challenges for organizations looking to provide consistent information that is optimized for viewing across all channels.

Businesses looking to implement knowledge management across multiple channels should:

  • Deliver consistent multi-channel data. Users don’t want to have to repeat themselves by reentering data or explaining their issue multiple times at each stage of their interaction with customer service.
  • Optimize content so it is viewable on any channel. Information might look different on a smartphone than on a web browser, and graphics-intensive sites might provide a poor user experience for low-bandwidth customers.
  • Integrate all channels. Customer service agents should be able to move easily among the different channels to provide a seamless, unified customer response.

Some people prefer to call, some want to email, others would rather chat or post on social media. A knowledge management system can make it easier to accommodate all customers, regardless of their preference.

Improve Customer Service Responses

Customer service often depends upon a rapid, user-friendly response. Knowledge management systems can facilitate this by making data available rapidly, on a single screen if possible, with drill-down features that make further information available when necessary.

Businesses looking to speed up customer response with knowledge management should:

  • Design systems to answer queries fast. Impatient customers won’t be forgiving of underpowered hardware or glitchy software.
  • Provide a single dashboard or screen. Identify the key information to help serve customers quickly and summarize key customer data on a single, easy-to-read dashboard for customer service representatives.
  • Include comprehensive drill-down features. When a representative needs more information about a customer or transaction, they should be able to get to it from the main screen without going into another system or location.
  • Prevent unnecessary delays. Any additional steps or unnecessary information can result in customer frustration, dropped calls, and customer churn.

Callers expect quick answers based on the correct data. Doing everything possible to provide them with those answers is essential.
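As a rough illustration of the single-dashboard-plus-drill-down pattern described above, the sketch below assembles a one-screen customer summary and exposes a drill-down lookup for detail. The customer fields and the in-memory data store are invented for the example.

```python
# Hypothetical in-memory data store standing in for integrated back-end systems.
CUSTOMERS = {
    "C-1001": {
        "name": "Pat Example",
        "plan": "Premium",
        "open_tickets": [{"id": "T-2002", "subject": "Billing dispute", "status": "open"}],
        "recent_orders": [{"id": "O-9", "total": 129.99, "status": "shipped"}],
    }
}

def dashboard_summary(customer_id: str) -> dict:
    """Key facts an agent needs on one screen."""
    c = CUSTOMERS[customer_id]
    return {
        "name": c["name"],
        "plan": c["plan"],
        "open_tickets": len(c["open_tickets"]),
        "recent_orders": len(c["recent_orders"]),
    }

def drill_down(customer_id: str, section: str) -> list[dict]:
    """Full detail for one section, reachable from the main screen."""
    return CUSTOMERS[customer_id][section]

print(dashboard_summary("C-1001"))
print(drill_down("C-1001", "open_tickets"))
```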

Increase Customer Self-Service 

Online knowledge bases may be giving way to artificial intelligence (AI) and chatbots in some cases, but they are not going away—and many of them are poorly designed or outdated. A knowledge management system can be used to help overhaul a business’s online knowledge base with the following steps:

  • Enhance online search. Making it easy for users to find information quickly, without wading through endless documentation, will improve user experience and customer satisfaction.
  • Devise good systems of taxonomy. Identify the information customers want and how they search for it, and then make it easy for those keywords and search terms to provide relevant results.

Customers are comfortable and familiar with online searches, and delivering bite-sized answers in an easy format can help improve their experience.
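One simple way to act on the taxonomy point above is to map the words customers actually type to canonical topic tags, then return articles tagged with those topics. The synonym map and articles below are made up for illustration; a production knowledge base would use a richer taxonomy and search engine.

```python
# Hypothetical synonym map: customer wording -> canonical topic tag.
SYNONYMS = {
    "refund": "billing", "invoice": "billing", "charge": "billing",
    "password": "account-access", "login": "account-access",
}

# Hypothetical knowledge base articles tagged with canonical topics.
ARTICLES = [
    {"title": "How to request a refund", "tags": {"billing"}},
    {"title": "Resetting your password", "tags": {"account-access"}},
]

def self_service_search(query: str) -> list[str]:
    """Map raw query words to topics and return matching article titles."""
    words = query.lower().split()
    topics = {SYNONYMS[w] for w in words if w in SYNONYMS}
    return [a["title"] for a in ARTICLES if a["tags"] & topics]

print(self_service_search("I need a refund for this charge"))
```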

How to Design a Knowledge Management System for Customer Service

When designing or implementing a knowledge management system for the specific use of customer service, there are a few things to consider that will help ensure a better result.

Include Customer Service Representative Training

Organizations often focus their knowledge management efforts on the customer, but the system must also be a resource employees can use to better serve customers. When designing the system, incorporate training modules, use the knowledge base as a training aid during calls, and make it easy for representatives to find the data they need.

Without well-trained agents, any knowledge management system will flounder. Ensure the system serves both customers and agents, especially those learning the trade. Knowledgeable agents provide the best service.

Involve Customer Service Representatives in the Design Phase

One of the flaws of software design is that programmers don’t always understand or take the time to discover the needs of system users. When designing or implementing a knowledge management system, make sure that the system meets the needs of those front-line workers who will use it. Gain their input, let them try out the system at various stages in the build, and find metrics that align with their duties.

Integrate Related Systems

Knowledge management, customer relationship management (CRM), contact center, and key sales or management systems should not be separate islands within the enterprise. Avoid systems that are difficult or costly to integrate in favor of platforms that can easily fit into existing infrastructure. A centralized knowledge hub should align fully and integrate well with all other key customer-facing systems.

Incorporate Automation

Some call centers use automated voice response systems to reduce call volume, but automation can also be used to deliver better customer service. Implement chat systems that provide useful answers rapidly and hand callers off smoothly to customer service representatives to prevent long wait times and boost caller satisfaction. Ensure the system knows when to refer the customer to an agent, and provide a call-back option within a specified time.

Add Artificial Intelligence

AI systems like ChatGPT can be introduced into customer service to forward the mission of enhancing overall customer experience. For example, Natural Language Processing (NLP) AI can help interpret user intent rather than expecting users to know the right keywords to get the answer they need. NLP even takes into account industry-specific terminology, different languages, and special content like product names. Self-learning search engines continuously learn from every interaction to deliver increasingly accurate and targeted results.

AI and chat are big advances, but they are tools and must always be fitted to a definite business purpose if they are to improve the customer experience. Seek out AI tools geared to specific vertical markets, which will be better suited to the needs of the target audience.

Bottom Line: Using Knowledge Management Systems to Improve Customer Service

The modern customer is far different from those of even a decade ago. Knowledge management systems must be adjusted to cope with current needs by providing integrated, multi-channel systems that serve data in the format needed by agents and customers. Considering both customer and customer service representative needs when designing and implementing a system can help improve customer service and customer satisfaction while making staff more efficient and more effective.

Oracle Opens Cloud Region in Chicago
https://www.datamation.com/cloud/oracle-opens-cloud-region-chicago/ (January 27, 2023)

Oracle is expanding its cloud infrastructure footprint in the U.S.

The Oracle Cloud Region in Chicago is the company’s fourth in the U.S. and serves the Midwest with infrastructure, applications, and data for optimal performance and latency, according to the company last month.

The company hopes that organizations will migrate mission-critical workloads from their own data centers to Oracle Cloud Infrastructure (OCI).

Focus on Security, Availability, Resiliency

The new Chicago Cloud Region will offer over 100 OCI services and applications, including Oracle Autonomous Database, MySQL HeatWave, OCI Data Science, Oracle Container Engine for Kubernetes, and Oracle Analytics. It will pay particular attention to high availability, data residency, disaster protection, and security.

A zero-trust architecture is used for maximum isolation of tenancies, supported by many integrated security services at no extra charge. Its design allows cloud regions to be deployed within separate secure and isolated realms for different uses to help customers fulfill security and compliance requirements. It has a DISA Impact Level 5 authorization for use by U.S. government organizations to store and process Controlled Unclassified Information (CUI) and National Security Systems (NSS) information.

The Chicago Cloud Region contains at least three fault domains, which are groupings of hardware that form logical data centers for high availability and resilience to hardware and network failures. The region also offers three availability domains connected with a high-performance network. Low-latency networking and high-speed data transfer are designed to attract demanding enterprise customers.

Disaster recovery (DR) capabilities harness other Oracle U.S. cloud regions. In addition, OCI’s distributed cloud solutions, including Dedicated Region and Exadata Cloud@Customer, can assist with applications where data proximity and low latency in specific locations are of critical importance.

Oracle has committed to powering all worldwide Oracle Cloud Regions with 100% renewable energy by 2025. Several Oracle Cloud regions, including regions in North America, are powered by renewable energy.

See more: Oracle Opens Innovation Lab in Chicago Market

“Innovation Hub”

Clay Magouyrk, EVP at OCI, said the Midwest is “a global innovation hub across key industries,” as it’s home to over “20% of the Fortune 500, 60% of all U.S. manufacturing, and the world’s largest financial derivatives exchange.”

“These industries are increasingly seeking secure cloud services to support their need for high-speed data transfer at ultra-low latency,” Magouyrk said.

Samia Tarraf, Oracle Business Group global lead at Accenture, said the Oracle Cloud Region in Chicago will “provide clients with more choices for their cloud deployments.”

“We look forward to collaborating with Oracle to help organizations in the Midwest more quickly and easily harness the business-changing benefits of the cloud,” Tarraf said.

Oracle in the Growing Cloud Market 

Oracle holds an estimated 2% of the cloud infrastructure market, as of Q3 2022, according to Synergy Research Group. AWS holds the first position at 34%.

Gartner numbers put the size of the annual cloud market at somewhere around $400 billion, with growth rates maintaining a steady climb of about 20% or more per year.

Oracle aims to carve out a larger slice of the pie. It now provides cloud services across 41 commercial and government cloud regions in 22 countries on six continents: Africa, Asia, Australia, Europe, North America, and South America.

“The days of the cloud being a one-size-fits-all proposition are long gone, and Oracle recognizes that its customers want freedom of choice in their cloud deployments,” said Chris Kanaracus, an analyst at IDC.

“By continuing to establish cloud regions at a rapid pace in strategic locations, such as the U.S. Midwest, Oracle is demonstrating a commitment to giving its customers as many options as possible to leverage the cloud on their terms.”

See more: Best Cloud Service Providers & Platforms

Yahoo Selects AWS Public Cloud for Ad Division
https://www.datamation.com/cloud/yahoo-selects-aws-public-cloud-ad-division/ (January 27, 2023)

Yahoo is working with AWS as a preferred public cloud provider for its advertising technology business Yahoo Ad Tech.

All Yahoo advertising technology workloads — including its media-buying and supply-side platforms, analytics, and identity solutions and products — are being switched from Yahoo on-premises data centers to AWS as part of an ongoing digital transformation strategy,  according to AWS last month.

The goal is to reduce IT infrastructure costs, transform advertising operations, and develop tailored and immersive solutions to help brands better connect with audiences.

Improving Ad Performance and Revenue

With a current reach of 540 million people worldwide, Yahoo Ad Tech covers a large inventory of mobile, web, and TV channels.

By moving its central data platform to AWS, Yahoo Ad Tech aims to improve advertising effectiveness, personalization, and engagement for its advertiser and publishing customers.

Amazon Elastic Compute Cloud (Amazon EC2) compute-optimized instances will provide publishers, advertising agencies, and brand customers with insights on real-time advertising performance. In addition, this platform will reduce the time it takes to deliver the insights advertisers need to reach the right audiences at the right times and in the right formats.

The AWS global infrastructure and portfolio of analytics, compute, machine learning (ML), serverless, and storage capabilities will help Yahoo’s ad-decisioning engines attract more advertisers and manage its ad business more effectively while accelerating the pace of innovation.

This will show up in areas such as improved ad measurement, optimized real-time ad bidding, and refined ad inventory and effectiveness analysis to determine the best mix of advertising.

Yahoo Using AWS Data Storage, Analytics, and Cloud Training

The AWS-Yahoo Ad Tech deal takes advantage of a number of recent additions and updates to the Amazon service portfolio.

For example, the implementation of the latest version of Amazon Simple Storage Service (Amazon S3) makes it possible to construct a centralized data lake to store hundreds of petabytes of data. Beyond storage at a massive scale, this data lake will enable Yahoo Ad Tech to eliminate internal data silos.

The company can then tap into recently enhanced AWS analytics services, including Amazon EMR, a cloud big data service for processing vast amounts of data using open-source tools, and Amazon Athena, an interactive query service that makes it easy to analyze data. These rapid analysis capabilities heighten the ability to spot advertising trends, target audiences, and deliver ad performance insights.

Amazon SageMaker, for building, training, and deploying ML models in the cloud and at the edge, will also be used to streamline Yahoo Ad Tech’s ML pipeline.

AWS will also help Yahoo Ad Tech via cloud skills training. The company is educating personnel using the AWS Designated Virtual Trainer (DVT) program, which will provide the company with more than 50 instructor-led classes in 2022. The target is for 2,000 IT employees to receive foundational cloud training over the next two years. These classes will help employees develop skills in application development, data management, and security to support Yahoo’s migration goals.

See more: Delta Selects AWS as Preferred Cloud Provider

“Move Faster”

Aaron Lake, SVP of platforms engineering and CIO at Yahoo, said that “by harnessing the power of AWS, we’ll be able to move faster and give our customers what they value most — advertising solutions that provide the right combination of performance, audiences, and revenue growth.”

Running all of Yahoo Ad Tech on AWS gives Yahoo “a broad portfolio of services that will allow us to help advertisers achieve the returns they want by providing them with audience targeting, while our ad publisher customers are able to scale and monetize their ad space,” Lake said.

Matt Garman, SVP of sales, marketing, and global services at AWS, said media and digital advertising companies rely on AWS as their cloud provider “because we help them deliver performance and drive real growth.”

“We’re excited to help Yahoo accelerate its migration to the cloud and bring innovative solutions that help advertisers serve customers all over the world,” Garman said.

AWS Leads Market

AWS holds the largest estimated share of the cloud infrastructure market at 34%, as of Q3 2022, according to Synergy Research Group. Microsoft Azure holds the second position at 21%.

To maintain that position, it steadily expands its services to support virtually any cloud workload. Currently, it has more than 200 services, from compute to artificial intelligence (AI) and from the Internet of Things (IoT) to application development.

Gartner analyst Raj Bala noted that AWS was named a leader and scored highest in its recent “Magic Quadrant” for cloud infrastructure and platform services (CIPS).

“AWS has a future focus on expanding the size of the market it serves by moving into new territory, such as private 5G and partnerships with telecoms,” Bala said.

“AWS continues to have the greatest breadth and depth of capabilities of any provider in the market for CIPS.”

See more: Best Cloud Service Providers & Platforms

5 Top Web Application Firewall Trends in 2023
https://www.datamation.com/security/web-application-firewall-trends/ (January 20, 2023)

Web application firewalls (WAFs) are designed to protect web applications. They achieve this by filtering, monitoring, and blocking malicious HTTP/S traffic that penetrates or attempts to penetrate those applications.
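As a minimal sketch of the signature-style filtering a WAF performs, the snippet below checks an incoming request against a few illustrative patterns. The patterns, request format, and allow/block decision are simplified assumptions, not production rules.

```python
import re

# Illustrative signatures only; real WAF rule sets are far more extensive.
SIGNATURES = [
    re.compile(r"(?i)\bunion\b.+\bselect\b"),   # crude SQL-injection pattern
    re.compile(r"(?i)<script\b"),               # crude cross-site-scripting pattern
    re.compile(r"\.\./"),                       # path traversal attempt
]

def inspect_request(path: str, body: str) -> str:
    """Return 'block' if any signature matches the request, else 'allow'."""
    payload = f"{path}\n{body}"
    for sig in SIGNATURES:
        if sig.search(payload):
            return "block"
    return "allow"

print(inspect_request("/search?q=1 UNION SELECT password FROM users", ""))  # block
print(inspect_request("/products?id=42", ""))                               # allow
```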

Web application firewall technology is a critical part of a company’s cybersecurity efforts. To help organizations keep up with this long-standing solution, here are some of the top trends in the web application firewall market:

1. WAF Growth Surge

Web application firewalls may not be the most cutting-edge technology around. Yet, they continue to play a vital role in enterprise security and represent a high-growth segment of the security market. Market research numbers show demand for overall firewall solutions, including WAFs, growing at 14% annually, according to Dell’Oro Group.

“Firewalls are foundational to good enterprise network security hygiene, and we do not foresee any solution fully displacing them over the next five years,” said Mauricio Sanchez, an analyst at Dell’Oro Group.

Sanchez pointed out that web application firewall revenue surged by over 30% during 2022, with estimated annual revenue in excess of $2 billion for the year. Three top WAF vendors, Akamai, Cloudflare, and F5 Networks, now represent over half the market by revenue, according to Dell’Oro Group.

See more: 5 Top Firewall Trends

2. WAFs No Longer Enough

While a WAF remains an important security tool, it relies on signatures to identify and block suspicious activity, according to Pete Klimek, director of technology, office of the CTO, Imperva.

For most digital businesses, this is not enough to stop the growing number of automated and complex security threats. Automated fraud, business logic attacks, and other forms of API abuse don’t rely on known attack patterns, making them difficult for a web application firewall to identify and block.

Further, as businesses leverage the cloud, applications grow more complex and monolithic applications have decomposed into APIs, microservices, and serverless functions. In addition, many web application firewall offerings are challenging to deploy in hybrid or cloud-native environments.

As a result, organizations are now looking to invest in web application and API Protection (WAAP), said Klimek. With a unified, single-stack approach, a cloud-based WAAP provides multiple layers of security in the forms of WAF, API security, distributed denial of service (DDoS) protection, and advanced bot protection.

“WAAP can be deployed in nearly any environment and equips security teams with a singular view of their attack landscape, giving them the ability to identify initial signs of malicious behavior and mitigate multi-vector attacks,” Klimek said.

3. Identity-Based Approach

Mike Kiser, director of strategy and standards at SailPoint, concurs that web application firewalls are no longer enough.

Kiser regards them as a barrier to entry for enterprises. Organizations have been combining web application firewalls with other protections that are edge-focused, such as bot mitigation and API security, he said. Ultimately, these capabilities can help protect applications, but he believes their impact is limited on their own. Design-level choices must be made to adequately protect the identity-centric security model of the application layer.

“This is most effectively accomplished through a consistent approach to identity: limiting the impact of a compromised account, being able to detect strange user behavior and lateral movement, and being able to govern the use of identity with an audit trail are key,” Kiser said.

See more: 10 Best Identity and Access Management (IAM) Solutions

4. WAF Sophistication

Michael Tremante, product manager at Cloudflare, has a slightly different take. He thinks web application firewalls are gradually becoming more sophisticated, and computationally expensive, anomaly detection systems.

His rationale? Traditional attacks are well understood and, although still very much used by attackers, are no longer the only concern; process-flow anomalies in both end-user and API-based interfaces are a recent focus point.

For example, a WAF might automatically detect and alert whenever a user performs an online banking currency transaction outside of the normally expected steps or time frame, in real time rather than through log post-processing. Doing this at scale is the next challenge being solved.
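A toy version of that kind of process-flow check might look like the following sketch, which flags a transaction whose steps or timing fall outside the expected flow. The step names and timing threshold are assumptions for illustration.

```python
# Expected order of steps for a currency transfer and a plausible minimum duration.
EXPECTED_STEPS = ["login", "select_account", "enter_amount", "confirm_otp", "submit"]
MIN_SECONDS = 5.0  # a human normally takes at least a few seconds

def is_anomalous(steps: list[str], duration_seconds: float) -> bool:
    """Flag transactions that skip steps, reorder them, or complete implausibly fast."""
    if steps != EXPECTED_STEPS:
        return True
    return duration_seconds < MIN_SECONDS

print(is_anomalous(["login", "enter_amount", "submit"], 1.2))   # True: steps skipped
print(is_anomalous(EXPECTED_STEPS, 42.0))                       # False: normal flow
```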

“More sophisticated attackers and bots are driving the barrier higher,” Tremante said.

“Traditional on-premise WAFs are not able to handle these detections. For large environments that have more data, it’s expensive. Native cloud-based WAFs are better suited, assuming they have the technology in place, to sustain this demand.”

5. Don’t Forget Patching

Attacks on web applications are common against large and small businesses alike, for a variety of reasons.

While a web application firewall acts as a proxy, which manages the traffic between an application server and its clients, attackers have become smarter and more proficient. They routinely look for vulnerabilities in this space.

Robert Anderson Jr., Chairman and CEO, Cyber Defense Labs, said that cybercriminals continue to exploit unpatched systems, including unpatched firewalls and web application firewall software.

“It is important to automate and customize patching for Windows, macOS, and Linux and everything else,” Anderson said.

“As companies still do not make sure all the patches have been taken and are fixed, they continue to suffer from large-scale ransomware and intellectual property (IP) theft.”

See more: Top 12 Web Application Firewall (WAF) Solutions

5 Top Vulnerability Management Trends in 2023
https://www.datamation.com/security/vulnerability-management-trends/ (January 20, 2023)

Vulnerability management seeks to lower risk by identifying and dealing with any possible lines of incursion into a network by cybercriminals.

The field of vulnerability management includes automated scans, configuration management, regular penetration testing, patching, keeping track of various metrics, and reporting. The category has been evolving rapidly within cybersecurity, and here are some of the top trends in the vulnerability management market:

1. More Than Scans

Vulnerability management is all about identifying, prioritizing, and remediating vulnerabilities in software.

As such, it encompasses far more than the running of vulnerability scans repeatedly to look for known weaknesses lurking within the infrastructure. Traditionally, vulnerability management also includes patch management and IT asset management. It addresses misconfiguration or code issues that could allow an attacker to exploit an environment as well as flaws or holes in device firmware, operating systems, and applications running on a wide range of devices.

“These vulnerabilities can be found in various parts of a system, from low-level device firmware to the operating system all the way through to software applications running on the device,” said Jeremy Linden, senior director of product management, Asimily.

See more: A holistic approach to vulnerability management solidifies cyber defenses

2. Vulnerability Management Broadens

Some analysts and vendors stick strictly to the NIST definition when they’re talking about vulnerability management. Others include security information and event management (SIEM) with vulnerability management as part of larger suites. And a few combine it with threat intelligence, which prioritizes actions and helps IT to know what to do and in what order.

Gartner recently coined the term attack surface management (ASM). The analyst firm defines ASM as the “combination of people, processes, technologies, and services deployed to continuously discover, inventory, and manage an organization’s assets.”

ASM tools are said to go beyond vulnerability management. The aim is to improve asset visibility, understand potential attack paths, provide audit compliance reporting, and offer actionable intelligence and metrics.

3. Vulnerability Management as a Service

The as-a-service trend has invaded so many areas of IT that it’s no wonder vulnerability management as a service has emerged.

“With more than 20K vulnerabilities found and published in a single year, vulnerability management has become an enormous task,” said Michael Tremante, product manager, Cloudflare.

“This is made worse for large enterprises who also have the challenge of not necessarily knowing the full set of software components being used internally by the organization, potentially putting the company at risk. A big trend is adoption of managed services/SaaS environments, as they are externally managed, and offloading of vulnerability management to third parties.”

Thus, a growing set of products are hitting the market that help companies tackle vulnerability management via managed services of one kind or another.

See more: Vulnerability Management as a Service (VMaaS): Ultimate Guide

4. Container Vulnerabilities

The container security market is growing steadily. It is expected to be worth more than $2.5 billion by 2025, according to analyst firm KuppingerCole.

Containers and Kubernetes have become largely synonymous with modern DevOps methodologies, continuous delivery, deployment automation, and managing cloud-native applications and services.

However, the need to secure containerized applications at every layer of the underlying infrastructure — from bare-metal hardware to the network to the control plane of the orchestration platform itself — and at every stage of the development life cycle — from coding and testing to deployment and operations — means that container security must cover the whole spectrum of cybersecurity and then some, said KuppingerCole.

Vulnerability management platforms are gradually adopting features aimed squarely at containerized environments. Several vendors have announced new container vulnerability scanning and vulnerability management features. Expect these to become a barrier to entry in the near future.

See more: Securing Container and Kubernetes Ecosystems

5. Autonomous Endpoint Approach

Due to the way the threat landscape is evolving, the way vulnerability management platforms are shifting, and the fast pace of innovation as evidenced by containerization, digitalization, and the cloud, a new approach is needed, according to Ashley Leonard, CEO, Syxsense.

“Businesses possess incredibly powerful processors inside storage equipment, servers, and desktops, which are underutilized in many cases,” Leonard said.

“Many of the tasks managed today by the cloud could be better performed at the endpoint — and we will begin to see some functions decentralized onto endpoints to take advantage of this untapped compute potential.”

For example, Syxsense has been incorporating more features into its vulnerability management tools. This includes more orchestration and automation capabilities, stronger endpoint capabilities, and mobile device management. These augment existing patch management, vulnerability scanning, remediation, and IT management capabilities.

See more: 12 Top Vulnerability Management Tools

10 Top Vulnerability Scanning Trends in 2023
https://www.datamation.com/security/vulnerability-scanning-trends/ (January 20, 2023)

Vulnerabilities are everywhere. Whether they stem from sloppy passwords, misconfigurations, unpatched systems, or zero-day attacks, organizations need to be on the alert for any potential issues. Vulnerability scanning is an essential part of the cybersecurity arsenal for finding such vulnerabilities.

A vulnerability is defined by the International Organization for Standardization (ISO) in ISO/IEC 27002 as “a weakness of an asset or group of assets that can be exploited by one or more threats.” A threat is anything that can exploit a vulnerability, and damage occurs when an open vulnerability is exploited by a threat. Here are some of the top trends in the vulnerability scanning market:

1. Government Warning

The importance of vulnerability scanning was underscored in a recent directive by the Cybersecurity and Infrastructure Security Agency (CISA), part of the U.S. Department of Homeland Security.

The directive made it mandatory for government entities to do continuous vulnerability scanning on all network appliances. They have been given until April 3, 2023 to comply.

They are required to list any vulnerabilities found across all assets running on their systems. This has to be done every 14 days, and scanning should be done regularly within these 14-day windows. Further, all vulnerability detection signatures used by these agencies are to be updated at an interval no greater than 24 hours from the last vendor-released signature update. Mobile devices are included in these requirements.

Clearly, government systems have suffered badly due to undetected and unremediated vulnerabilities. Enterprises and SMBs are no different. They would do well to heed these CISA directives.

2. Constant Alertness

Robert Anderson Jr., chairman and CEO, Cyber Defense Labs, believes vulnerability scanning has not been thorough enough in the enterprise.

While vulnerability management is supposed to be constantly looking at and protecting all endpoints, workstations, laptops, servers, virtual machines (VMs), web servers, and databases, Anderson said that most companies only cover what they deem is important.

“Companies need to constantly be looking for vulnerabilities that may be used as an attack path by an adversary,” Anderson said.

“Continual scanning is now being utilized by most large companies that we partner with. The need for unified and constant visibility of your distributed IT network irrespective of endpoints is imperative in today’s cyberthreat environment.”

3. Golden Oldies

Zero-day attacks get the lion’s share of attention — and understandably. After all, they represent newly discovered vulnerabilities and exploits for which there is currently no remedy, although their publication means remedies will be issued rapidly.

Yet, well-known and sometimes quite old vulnerabilities continue to exist in many enterprises.

For example, the Log4j vulnerability has been well known for more than a year. Yet, cybercriminals continue to exploit it.

“As the Log4j vulnerability shows, discovering, mitigating, and fixing vulnerabilities as soon as possible is more important than ever to good cyber hygiene,” said Michelle Abraham, an analyst at IDC.

“Leaving vulnerabilities without action exposes organizations to endless risk, since vulnerabilities may leave the news but not the minds of attackers.”

Unpatched vulnerabilities even older than Log4j are lurking inside many companies, some as much as a decade old. When cybercriminals find these, they know they have an easy route into the enterprise. Vulnerability scanners need to be employed to find these, and organizations need to ensure they are patched immediately.

See more: Cybersecurity Agencies Reveal the Top Exploited Vulnerabilities

4. Update Your Vulnerability Databases

Part of the solution to not missing aging vulnerabilities is to ensure vulnerability scanners use a database of known issues to look for vulnerabilities, misconfigurations, or code flaws that pose potential cybersecurity risks.

Further, that database needs to be complete and regularly updated.

Popular scanners are missing at least 3.5% of all ransomware vulnerabilities, according to the Ivanti “Ransomware Report.” As well as keeping databases and vulnerability signatures up to date, some recommend using multiple scanners.

5. Include Penetration Testing

Vulnerability scanning is essentially a process of checking out where weaknesses may lie by assessing internal systems, applications, misconfigurations, and cloud dependencies.

Penetration testing takes a different approach. It is generally accomplished by ethical hackers who try to penetrate the network, find holes, and exploit known or unknown vulnerabilities. More organizations are supporting vulnerability scanning with pen testing to ensure they find everything.

For those that lack internal resources, vulnerability scanning and penetration testing are now available as a service. This is a growing trend. Penetration testing-as-a-service (PTaaS) platforms have emerged that remove the burden of testing from IT or the need to hire outside hackers.

See more: 5 Top Penetration Testing Trends

6. Personal Information

Personally identifiable information (PII) is very much in the spotlight. Cybercriminals seek it, as it provides them with data they can sell, compromise, or use to hack into systems and scam people and organizations.

Similarly, organizations are constantly looking for PII, so they can ensure it is protected and they don’t fall afoul of privacy and compliance mandates. Accordingly, vulnerability scanners are emerging that look for PII as well as vulnerabilities.

“As the awareness of better privacy for customers’ sensitive data is rising, so does the number of solutions that help gain insights around privacy posture using scanning tools,” said Gil Dabah, co-founder and CEO at Piiano.

“Vulnerabilities recognized by scanning tools are including additional findings that are privacy related.”

Imagine a company with hundreds or thousands of developers that decides to harden the security of the PII it is collecting to decrease the risk of data exfiltration due to a breach. Such a task, when done manually, can take weeks.

With code scanning tools, a team can get a list of all PII the organization collects, verify that the data collected is aligned with the privacy policy, and better protect high-risk PII, such as Social Security numbers (SSNs). New tools not only help find PII, they also provide insight into where each piece of PII is collected, what processes use the data, and where it is stored.
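A stripped-down example of what such a scanning pass might do is shown below: it searches text for patterns that resemble U.S. Social Security numbers and email addresses. The patterns are illustrative only; real PII scanners combine many more detectors with validation and context analysis.

```python
import re

# Illustrative detectors only; production tools use richer patterns plus validation.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_for_pii(text: str) -> dict[str, list[str]]:
    """Return every candidate PII value found, grouped by type."""
    findings: dict[str, list[str]] = {}
    for kind, pattern in PII_PATTERNS.items():
        hits = pattern.findall(text)
        if hits:
            findings[kind] = hits
    return findings

sample = "Customer 123-45-6789 wrote from pat@example.com about a refund."
print(scan_for_pii(sample))  # {'ssn': ['123-45-6789'], 'email': ['pat@example.com']}
```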

7. SBOMs

A software bill of materials, or SBOM, is an inventory of the ingredients that make up different software components. SBOMs are being used to drill down into exactly where vulnerabilities may lurk.

Take the recent Log4j vulnerability. Because it resided in widely used Java libraries, few realized how pervasive it was. Organizations thought they had patched or addressed all needed areas to combat Log4j. Yet, there were more instances hiding in all sorts of nooks and crannies of the enterprise. SBOMs make it easier to know what contains which software elements, so it is easier to address vulnerabilities.
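To illustrate how an SBOM helps, the sketch below represents a minimal component inventory and checks it for a named package below a fixed version. The inventory format and version check are simplified assumptions rather than a full CycloneDX or SPDX implementation.

```python
# Minimal, hand-rolled SBOM: each entry lists a component and its version.
SBOM = [
    {"name": "log4j-core", "version": "2.14.1"},
    {"name": "spring-web", "version": "5.3.20"},
    {"name": "log4j-core", "version": "2.17.1"},
]

def parse_version(v: str) -> tuple[int, ...]:
    return tuple(int(part) for part in v.split("."))

def affected_components(sbom: list[dict], name: str, fixed_in: str) -> list[dict]:
    """Return components with the given name whose version is below the fixed version."""
    fixed = parse_version(fixed_in)
    return [c for c in sbom if c["name"] == name and parse_version(c["version"]) < fixed]

# The Log4j 2.x issues were fully addressed in log4j-core 2.17.1
# (simplified here; consult current advisories for exact ranges).
print(affected_components(SBOM, "log4j-core", "2.17.1"))
# -> [{'name': 'log4j-core', 'version': '2.14.1'}]
```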

“The move toward automated, formally structured, machine-readable SBOMs is clear,” said Alex Rybak, senior director of product management, Revenera.

“More and more software companies expect SBOMs to include all third-party, including open-source and commercial, software that’s used in their applications. An SBOM that provides a single, actionable view is essential, so that when a vulnerability is detected, the supplier can quickly assess the impact to their portfolio of applications and expedite remediation plans.”

8. Supply Chain Attacks

Major cyberattacks have made it clear that vulnerabilities within the software supply chain must be a vital element of security scans.

In well-publicized breaches, cyberattackers gained a foothold by exploiting an outdated build server with a known vulnerability. Further examples include a remote code execution (RCE) vulnerability (CVE-2021-22205) and dozens of vulnerable Jenkins plugins. They demonstrate the importance of securing development tools and their ecosystems.

“Organizations have expanded their vulnerability scanning efforts from COTS, cloud, and source code to include the software delivery pipeline itself,” said Andrew Fife, VP of marketing, Cycode.

“While much of the hype around software supply chain attacks has been directed at traditional software composition analysis, which focuses on the delivered application, the reality is that the majority of attacks start elsewhere.”

9. Scanning for BEC And BAC

Business application compromise (BAC) is where cyberattackers target cloud access identity providers, like Okta or OneLogin, that are often used by business applications to provide a single sign-on (SSO) experience to users.

Attackers compromise the user’s Okta login via phishing and overcome multi-factor authentication (MFA) by brute-forcing push notifications in the hope that the user accidentally approves one of them.

Business email compromise (BEC) most often happens in Microsoft 365. Criminals send an email message that appears to come from a known source making a legitimate request. In original deployments of Office 365 tenants, Microsoft enabled IMAP and POP3 in O365 Exchange by default, as well as Basic Authentication. IMAP and POP3 don’t support MFA, so even if you have MFA enabled, attackers can still access these mailboxes.

“Disable legacy protocols, like IMAP and POP3, immediately, especially if you’ve gone through the process to enable MFA,” said A.N. Ananth, president and chief strategy officer, Netsurion.

“Once you turn those off, strongly consider disabling BasicAuthentication to prevent any pre-auth headaches on your Office 365 tenants.”

To address BAC, Ananth said to be alert for multiple identity provider sessions from the same user across multiple, non-mobile operating systems, and to alert on potential brute-force push requests. As a result of this type of threat, scanners are now checking for such vulnerabilities.
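One way to operationalize that advice is a simple log heuristic like the sketch below, which flags any user who receives an unusual burst of MFA push requests within a short window. The event format, threshold, and window are assumptions for illustration.

```python
from collections import defaultdict

# Flag more than THRESHOLD push requests to one user within WINDOW seconds.
THRESHOLD = 5
WINDOW = 300  # seconds

def suspicious_push_activity(events: list[dict]) -> set[str]:
    """events: [{'user': str, 'timestamp': float, 'type': 'mfa_push'}, ...]"""
    by_user: dict[str, list[float]] = defaultdict(list)
    for e in events:
        if e["type"] == "mfa_push":
            by_user[e["user"]].append(e["timestamp"])

    flagged = set()
    for user, times in by_user.items():
        times.sort()
        for i in range(len(times)):
            # Count pushes inside the sliding window starting at times[i].
            in_window = sum(1 for t in times[i:] if t - times[i] <= WINDOW)
            if in_window > THRESHOLD:
                flagged.add(user)
                break
    return flagged

events = [{"user": "alice", "timestamp": float(t), "type": "mfa_push"} for t in range(0, 70, 10)]
print(suspicious_push_activity(events))  # {'alice'}: 7 pushes within 5 minutes
```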

See more: Simple Guide to Vulnerability Scanning Best Practices

10. Automated Remediation

The norm has long been that multiple tools are needed to bridge the scanning and remediation gap. Scanners find out what might be wrong. Other tools, and plenty of manual effort, are required to address the problems and safeguard the enterprise.

But that is changing according to Ashley Leonard, CEO of Syxsense. His company, for example, offers a single agent that automates the management of endpoints and reduces the attack surface.

“We are seeing solutions hitting the market that combine the necessary functionality to remediate threats that are blended: threats that require the application of a patch as well as configuration changes,” Leonard said.

“This ties in with threat prioritization whereby both patch and security threats are given different levels of risk based on the specifics of their environments. And finally, we are seeing software designed to bring about intelligent endpoints that can automatically maintain an endpoint in a desired state.”

See more: 22 Best Vulnerability Scanner Tools

Cisco Report Shows Cybersecurity Resilience as Top of Mind
https://www.datamation.com/security/cisco-report-shows-cybersecurity-resilience-top-mind/ (January 19, 2023)

A new report indicates that cybersecurity resilience is a top focus for companies as they look to defend themselves in the modern threat landscape.

Based on survey responses from over 4,700 IT pros across 26 countries, 96% said security resilience is a high priority, according to Cisco’s annual “Security Outcomes Report, Volume 3: Achieving Security Resilience,” which the company released last month.

The reason is not hard to fathom. In the past two years, most respondents experienced a security event that impacted business. Network or data breaches (51.5%), network or system outages (51.1%), ransomware events (46.7%), and distributed denial-of-service (DDoS) attacks (46%) were the leading types of incidents.

Such incidents led to a host of problems across the enterprise. These included IT and communications interruption (62.6%), supply chain disruption (43%), impaired internal operations (41.4%), and lasting brand damage (39.7%).

Achieving Security Resilience

The report delves into the factors that could provide the biggest gains in enterprise security resilience, whether based on culture, IT environment, or security technology.

Cisco took these factors and devised a security resilience scoring system based on seven areas. Those most closely adhering to these core principles are in the top 10% of resilient businesses. Those missing most of these elements are in the bottom 10%.

Culture is especially vital. Those with poor security support from the C-suite score 39% lower than those with strong executive support. Similarly, those with a thriving security culture score 46% higher than those lacking it.

But it isn’t all about culture. Staffing, too, played a definite role, whether based on experienced staff, certification and training, or the sheer number of internal resources. The report shows those companies maintaining extra internal staffing and resources to respond to incidents gain a 15% boost in resilient outcomes.

In other words, headcount can mean the difference between faring well and poorly during an event. Those organizations trying to get by with as few IT or security personnel as possible are cautioned to consider a change of approach.

Hybrid Hiccups

In addition, the report compares levels of resilience between on-premises, public cloud, and hybrid environments. Those adopting a mostly on-premises or mostly cloud approach score well on resiliency with neither one dominating.

It appears that a commitment to be wholly public cloud or wholly on-premises impacts cyber resilience positively.

It is those in the midst of the transition or who aren’t quite sure whether to invest in more cloud or bring things back in-house who need to be most wary. The report highlights the need for caution in the switch to the cloud, particularly for those in the initial stages of the move or those establishing a hybrid cloud environment. Their scores drop between 8.5% and 14% in terms of resiliency, depending on how difficult the hybrid environments are to manage.

Businesses need to take care to reduce complexity when transitioning from on-premises to the cloud, according to Helen Patton, CISO, Cisco Security Business Group.

See more: 10 Top Hybrid Cloud Trends

Zero Trust, XDR, and SASE Improve Resilience

The Cisco report emphasizes the importance of recent developments on the security front, such as zero-trust network access (ZTNA), extended detection and response (XDR), and secure access service edge (SASE).

While culture, head count, and architecture all influence cyber resiliency, so too does the adoption of the right security solutions. Zero trust, for example, increases scores by 30%: Those adopting the zero-trust model, principles, and associated technologies achieved a higher level of resilience.

Additional technologies that make a difference are XDR, which correlated to a 45% increase in resilience for organizations adopting it compared to those lacking detection and response solutions. Similarly, those simplifying their infrastructure through the convergence of networking and security courtesy of SASE see a jump in security resilience of 27%.

See more: How to Build a Zero-Trust Network Model

“Value is protected”

Companies need the ability to “anticipate, identify, and withstand cyberthreats, and if breached, be able to rapidly recover from one,” said Patton with Cisco.

“That is what building resilience is all about,” Patton said. “Security, after all, is a risk business. As companies don’t secure everything, everywhere, security resilience allows them to focus their security resources on the pieces of the business that add the most value to an organization and ensure that value is protected.”

The “Security Outcomes Report” is “a study into what works and what doesn’t in cybersecurity,” said Jeetu Patel, EVP and GM of security and collaboration, Cisco.

“The ultimate goal is to cut through the noise in the market by identifying practices that lead to more secure outcomes for defenders.”

See more: Cisco’s “Security Outcomes Report, Volume 3: Achieving Security Resilience”

Convergence in the Network Security Market

Cisco is a leader in the convergence of security and networking. And with good reason. The worldwide network security market has maintained double-digit revenue growth for eight consecutive quarters, according to Dell’Oro Group.

Mauricio Sanchez, an analyst at Dell’Oro Group, explained that enterprises are beginning to think differently about networking and security. Instead of considering them as separate toolsets to be deployed once and infrequently changed, convergence is taking root.

“The vendor community has responded with a service-centric, cloud-based technology solution that provides network connectivity and enforces security between users, devices, and applications,” Sanchez said.

“SASE utilizes centrally controlled, internet-based networks with built-in advanced networking and security-processing capabilities. By addressing the shortcomings of past network and security architectures and improving recent solutions — in particular, SD-WAN and cloud-based network security — SASE aims to bring networking and security into a unified service offering.”

See more: 5 Top Network Security Trends

Microsoft Rolling Out Supply Chain Platform
https://www.datamation.com/applications/microsoft-rolling-out-supply-chain-platform/ (December 19, 2022)

REDMOND, Wash. — Microsoft is targeting the supply chain market with its latest software release.

The Microsoft Supply Chain Platform is designed to help organizations maximize their supply chain data estate investment via a combination of Microsoft artificial intelligence (AI), collaboration, low code, security, and SaaS applications within one overarching platform, according to the company last month.

This supply chain software rollout by Microsoft comes at a time of supply chain disruption worldwide. Whether due to COVID-19 lockdowns, the Great Recession, the “Great Resignation,” quiet quitting, layoffs, legislation that impacted trucking and shipping, the war in Ukraine, or other factors, the global supply chain has stuttered of late. Chip shortages, cabling shortages, and much longer lead times for equipment have become the norm.

Supply chain management dovetails nicely with existing Microsoft strengths in enterprise resource planning (ERP), customer relationship management (CRM), collaboration, project management, and the cloud.

Microsoft Supply Chain Platform

The Microsoft Supply Chain Platform makes use of building blocks across Azure, Dynamics 365, Microsoft Teams, and the Power Platform for the development of enhanced supply chain capabilities. For example, a feature known as Dataverse enables users to create thousands of connectors to gain visibility across existing supply chain systems. They can use it to develop custom workflows using low-code solutions within the Power Platform. In addition, they can collaborate securely, both internally and externally, using Microsoft Teams.

Existing Microsoft partners within its already extensive ecosystem can use it to enable supply chain resiliency and agility for their own customers. Some will use it to carve out a niche of supply chain and domain expertise that piggybacks on other offerings, such as Dynamics 365 Supply Chain Management, Microsoft Azure, Microsoft Teams, and Power Platform.

Command Center

At the core of the Supply Chain Platform is the Microsoft Supply Chain Center, which provides a command center experience that can harmonize data from across existing infrastructure supply chain systems, such as data from Dynamics 365 and other enterprise resource planning (ERP) providers, including SAP and Oracle, along with stand-alone supply chain systems.

Microsoft Supply Chain Center, therefore, will be welcomed in some quarters as a ready-made command center for supply chain visibility and transformation. It can work natively with an organization’s supply chain data and applications to add more comprehensive collaboration, supply and demand insights, and order management. Note that Dynamics 365 Supply Chain Management customers can automatically gain access to Supply Chain Center.

Within Supply Chain Center, there are several components. Data Manager enables data ingestion and orchestration from current systems of execution. A supply and demand insights module leverages Azure AI models to predict upstream supply constraints and shortages as well as perform simulations. Smart news insights provides relevant news alerts on external events. An order management module orchestrates fulfillment and automates it with a rules-based system through real-time omnichannel inventory data, machine learning (ML), and AI.

“Petabytes of data”

“Businesses are dealing with petabytes of data spread across legacy systems, ERP, supply chain management, and point solutions, resulting in a fragmented view of the supply chain,” said Charles Lamanna, corporate VP, Microsoft Business Applications and Platforms.

“Supply chain agility and resilience are directly tied to how well organizations connect and orchestrate their data across all relevant systems.”

Lamanna said Microsoft Supply Chain Platform and Supply Chain Center “enable organizations to make the most of their existing investments to gain insights and act quickly.”

Supply chain solutions are “more critical than ever,” said Daniel Newman, founding partner and principal analyst, Futurum Research.

“Our early assessment of the Microsoft Supply Chain Platform and Supply Chain Center is that the company has put its technology, applications, and resources together in a way that will serve its customer base well in a wide swath of IT and operations environments, offering flexibility for diverse IT environments and continuous agility for transformation into the future,” Newman said.

Microsoft’s Recent Activity

Of late, Microsoft has been making waves via aggressive moves to expand the number of cloud markets it locally serves as well as the capacity of those locally situated data centers. Hardly a month goes by without yet another announcement about a new territory or two.

Similarly, AI has regularly been featured in company announcements in 2022, as has the expansion of Teams capabilities and its adoption across the enterprise landscape.

Backing everything, the company places high value on expanded partnerships and building out its channel partner ecosystem. There are regular announcements about new partners, improved channel programs, and features aimed at broader collaboration.

Growth of the Supply Chain Market

The global supply chain management market was valued at an estimated $18.5 billion in 2021, according to Grand View Research.

But with an expected expansion rate of 11% per year between 2022 and 2030, this market could be well in excess of $50 billion by the end of the decade.

See more: How Supply Chains Can Improve Demand Forecasting

IBM Releases Data Analytics Software to ‘Break Down’ Silos
https://www.datamation.com/big-data/ibm-releases-data-analytics-software-break-down-silos/ (December 19, 2022)

ARMONK, New York — IBM’s latest analytics software offering is designed to help enterprises break down data and analytics silos to facilitate better and faster decision making and address unpredictable disruptions.

Known as IBM Business Analytics Enterprise, the suite covers business intelligence (BI), planning, budgeting, reporting, and forecasting, and provides dashboards of data sources in use across the business, according to IBM last month.

IBM Business Analytics Enterprise

IBM Business Analytics Enterprise is designed to make sharing easier, avoid duplicate content, and protect information while offering a single point of entry to view the data, regardless of which BI or analytics system it resides in.

Take the case of the many sales, HR, and operations teams running inside one organization. Each requires access to data and insights from different business intelligence and planning tools. One department may wish to optimize sales goals while others want to create workforce forecasts or predict operational capacity. Problems and complexity can arise, though, when it’s necessary to share data and reporting across departments due to the use of multiple analytics solutions.

IBM Analytics Content Hub

In addition, this release incorporates a new IBM Analytics Content Hub that helps streamline how users discover and access analytics and planning tools from multiple vendors by presenting everything in a single view. Such capabilities are becoming increasingly essential due to skills shortages, tightening regulations, and the overall complexity of storing data across disparate silos, whether on prem or in the cloud. By arming themselves with this new tool, businesses can become more data driven as a way to differentiate themselves.

The content hub not only works with IBM Business Analytics, IBM Cognos Analytics with Watson, and IBM Planning Analytics with Watson, it also operates across other common business intelligence tools. Dashboards can be tailored by the user to specific needs. Algorithms recommend role-based content and rapidly compile reports. The system is designed to learn from usage patterns to improve its recommendations.

Analytics Upgrades

IBM has also upgraded a couple of its existing analytics and AI tools. IBM Cognos Analytics with Watson now has integration capabilities and better forecasting that considers multiple factors and seasons in trend predictions.

IBM Planning Analytics with Watson is being made available as-a-service on Amazon Web Services (AWS).

“More complete picture”

“Businesses today are trying to become more data driven than ever as they navigate the unexpected in the face of supply chain disruptions, labor and skills shortages, and regulatory changes,” said Dinesh Nirmal, GM of data, AI, and automation, IBM.

“But to truly be data driven, organizations need to be able to provide different teams with comprehensive access to analytics tools and a more complete picture of their business data, without jeopardizing their compliance, security, or privacy programs. IBM Business Analytics Enterprise offers a way to bring together analytics tools in a single view, regardless of which vendor it comes from or where the data resides.”

IBM’s Recent Activity

IBM continues to develop Watson along with other analytics, BI, and AI tools.

Recent news includes a foray into the data observability market with the acquisition of Databand.ai, whose technology helps enterprises catch bad data at the source. It gives IBM greater observability across the full IT stack: infrastructure, applications, data, and machine learning (ML).

In addition, the company has been spending big to enhance its data fabric, and Forrester Research gives IBM high marks in its recent report on the data fabric market. IBM uses the data fabric to dynamically and intelligently orchestrate governed data across a distributed landscape, providing the common data foundation that good analytics and AI depend on.

Advanced Analytics Aid Growth

IBM has a long history in the analytics and AI markets and has maintained a steady 5% to 10% share of the overall business analytics market over the long term, according to IDC. The bulk of the market now belongs to lower-end services, such as Google Analytics.

IBM operates more at the high end and is consolidating its position there, aiming to equip its customers with insights that make a major difference. Recent research from Forrester Research indicates the impact of this type of analytics.

“Firms with advanced insights-driven business (IDB) capabilities continue to outpace their competition and deliver better growth than less mature firms,” said Boris Evelson, an analyst at Forrester Research.

“What differentiates these firms is that they have consistently invested time, effort, and resources across the five IDB competencies: strategy, data, platforms, internal partners, and practices.”

Per Forrester, they are eight times more likely to grow by 20% or more, have a higher ability to use insights to discover new sources of revenue and create market differentiation, and can more frequently commercialize their data insights. These advanced insights-driven organizations are also 1.6x more likely to report using data, analytics, and insights to create experiences, products, and services that differentiate them within the market when compared to beginners.

See more: Top 5 Data Analytics Trends

Dell Technologies Expands Data Protection Line https://www.datamation.com/security/dell-technologies-expands-data-protection-line/ Mon, 19 Dec 2022 16:24:31 +0000 https://www.datamation.com/?p=23657 ROUND ROCK, Texas — Dell Technologies is extending its data protection offerings to improve overall cyber resiliency in multicloud environments.

The additions include Dell PowerProtect Data Manager software advancements, a new appliance, broader cyber recovery in public clouds, and zero-trust multicloud data protection, as well as more flexible backup storage as-a-service and a cyber recovery guarantee, according to the company last month.

The Dell PowerProtect Data Manager Appliance is said to be simple to use and easy to consume. It incorporates artificial intelligence (AI)-powered resilience and operational security features aimed at accelerating the adoption of zero-trust architectures and protecting against potential threats and cyberattacks.

Key Findings in 2022 Dell “Global Data Protection Index”

Dell conducted an in-depth survey of its customers that is covered in the 2022 Dell “Global Data Protection Index.”

Findings:

  • In the past year, cyberattacks accounted for 48% of all disasters, up from 37% in 2021, leading all other causes of data disruption
  • 85% of organizations using multiple data protection vendors see a benefit in consolidation
  • Organizations using a single data protection vendor incurred 34% less cost recovering from cyberattacks or other cyber incidents than those that used multiple vendors
  • 91% of organizations are aware of or planning to deploy a zero-trust architecture, but only 23% are deploying a zero-trust model and just 12% have it fully deployed

Embedding Zero Trust

The survey highlighted the fact that zero trust remains largely undeployed in enterprises.

Dell aims to rectify that by incorporating the zero-trust philosophy into its products and services, so customers don’t need to add yet another layer of security tools on top of their existing infrastructures. With embedded security features designed into the hardware, firmware, and security control points, this approach helps organizations achieve zero-trust architectures to strengthen cyber resiliency and reduce security complexity.

See more: Overcoming Zero-Trust Security Challenges

Dell PowerProtect Data Manager Appliance

The Dell PowerProtect Data Manager Appliance is initially available for small and mid-sized use cases, with support that scales from 12 to 96 TB of data. It offers a software-defined (SD) architecture for automated discovery and protection of assets, including VMware protection with Transparent Snapshots that help ensure VM availability. Identity and access management (IAM) capabilities are also built in.

Dell PowerProtect Data Manager software within the appliance addresses cyber resiliency and supports zero-trust principles, such as multifactor authentication (MFA), dual authorization, and role-based access controls.
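
As a concrete illustration of two of those principles, the sketch below shows a generic role-based access check combined with dual authorization. The roles, permissions, and function names are hypothetical; this is not Dell's implementation:

    # Generic dual-authorization check on top of role-based access control
    # (illustrative only; role and permission names are hypothetical)
    ROLE_PERMISSIONS = {
        "backup_admin":   {"delete_backup"},
        "security_admin": {"approve_delete"},
    }

    def delete_backup_allowed(requester_role, approver_role):
        """A destructive action needs an authorized requester plus a distinct approver."""
        requester_ok = "delete_backup" in ROLE_PERMISSIONS.get(requester_role, set())
        approver_ok = "approve_delete" in ROLE_PERMISSIONS.get(approver_role, set())
        return requester_ok and approver_ok and requester_role != approver_role

    print(delete_backup_allowed("backup_admin", "security_admin"))  # True
    print(delete_backup_allowed("backup_admin", "backup_admin"))    # False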

Dell PowerProtect Cyber Recovery

For fast cyber recovery from public cloud vaults, Dell PowerProtect Cyber Recovery for Google Cloud enables deployment of completely isolated cyber vaults in Google Cloud.

Securely separating and protecting data further safeguards it from cyberattack. Access to management interfaces is locked down by networking controls and requires separate security credentials and MFA. Those wanting to take advantage of these services can use existing Google Cloud subscriptions to purchase PowerProtect Cyber Recovery through the Google Cloud Marketplace or directly from Dell and channel partners.

“Integrated data protection”

“With virtually everything connected to the internet, the need to protect data is more important than ever,” said Jeff Boudreau, president and general manager, infrastructure solutions group, Dell Technologies.

“Point solutions don’t go deep or wide enough to help protect organizations. Dell helps customers strengthen cyber resiliency by offering integrated data protection software, systems, and services to help ensure data and applications are protected and resilient wherever they live.”

Dell’s Recent Activity

Dell has been busy on the cybersecurity front over the past year.

Its Dell Data Protection Suite is being used increasingly to rapidly back up thousands of VMs while providing prompt recovery in the event of a service disruption or outage.

The company has been steadily expanding its security portfolio via endpoint security services, beefed up supply chain security, and upgrades across its Dell PowerProtect Appliance line as well as its trusted devices and trusted infrastructure programs.

In addition, Dell Technologies has been advancing its reputation in the enterprise space via announcements, such as PowerScale cyber protection that incorporates AI, Dell PowerProtect Cyber Recovery, and a range of business resiliency services.

Growth of the Cybersecurity Market

Dell Technologies is investing heavily in the cybersecurity market, and with good reason.

McKinsey studies indicate that security represents a $2 trillion market opportunity over the long term. The consulting firm placed the value of the market at around $150 billion in 2021, with cybersecurity predicted to grow at a rate of at least 12% annually.
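
A back-of-the-envelope calculation puts that "long term" in perspective. This sketch simply assumes a flat 12% annual growth rate on the roughly $150 billion 2021 figure; the calculation is illustrative and not from McKinsey itself:

    import math

    base = 150.0      # cybersecurity market in $ billions, 2021 (McKinsey estimate)
    rate = 0.12       # assumed minimum annual growth rate
    target = 2000.0   # $2 trillion long-term opportunity

    years = math.log(target / base) / math.log(1 + rate)
    print(f"~{years:.0f} years to reach $2T at 12% annual growth")  # roughly 23 years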

Dell is a relatively small player in the cybersecurity market. But by incorporating more features into its existing products and services, it gradually takes a bigger slice of business from dedicated cybersecurity vendors. The argument becomes: why buy from Dell plus two security vendors when everything is included in some of these latest Dell products and services?

See more: Top 5 Cybersecurity Trends
