Viewpoints – Digital IT News (https://digitalitnews.com) | IT news, trends and viewpoints for a digital world

The Vital Role of EV Charging Infrastructure in Electric Vehicle Growth and Acceptance
https://digitalitnews.com/the-vital-role-of-ev-charging-infrastructure-in-electric-vehicle-growth-and-acceptance/ | Fri, 31 May 2024

In the global race toward net zero emissions and overall sustainability, electric vehicles (EVs) play a critical role. The rise in EV interest and use over the past decade has been encouraging: 2023 was a record-breaking year for EV sales, with 1.6 million EVs sold in the United States alone. All signs point to people wanting to adopt more sustainable transportation practices. However, for this shift to move the needle toward global sustainability, the EV charging infrastructure will need to be robust enough to support expanded EV use.

The current EV charging infrastructure

To support the over 2 million EVs on US roads today, over 175,000 EV charging stations have been built and installed. This number is on par with the number of gas stations in the US. However, research has revealed acute issues with the current EV charging infrastructure.

A recent J.D. Power study found that EV drivers reported upwards of 20% of EV charging stations to be inoperable at any given time. The unreliable nature of the EV charging infrastructure has led to what is being called “charging anxiety” among EV drivers. Where once drivers were concerned about the range of their fully charged EVs, now they are anxious about whether they will be able to charge their vehicles once they reach a charging station.

As the push to embrace EVs has grown, the problems with the EV charging infrastructure have become more crucial to examine. Recently, the National Electric Vehicle Infrastructure (NEVI) program, run by the US Department of Transportation's Federal Highway Administration, pledged to increase uptime to 97%, a lofty goal in the face of obvious issues. This goal arrived at the same time as the Biden Administration's Infrastructure Plan, which includes funds for expanded EV charging stations.
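To put these uptime figures in concrete terms, a quick back-of-the-envelope calculation (illustrative only) shows what 97% uptime means per charger per year versus the roughly 80% implied by the reliability complaints above:

```python
HOURS_PER_YEAR = 365 * 24  # 8760

def downtime_hours(uptime_fraction: float) -> float:
    """Expected downtime per charger per year at a given uptime level."""
    return HOURS_PER_YEAR * (1 - uptime_fraction)

current = downtime_hours(0.80)  # ~20% of stations inoperable at any time
target = downtime_hours(0.97)   # NEVI's 97% uptime goal

print(f"At 80% uptime: ~{current:.0f} hours of downtime per charger per year")
print(f"At 97% uptime: ~{target:.0f} hours of downtime per charger per year")
```

The gap between roughly 1,750 hours and roughly 260 hours of annual downtime per charger is what the trained EVSE workforce discussed below is meant to close.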

It has become apparent that for the EV Revolution to thrive, the charging infrastructure must be better supported.

Support for charging, support for EV growth

To buoy the current charging infrastructure and lay a strong foundation for future charging stations, the focus must not only be on building more stations but also on having a workforce that is trained and ready to support them.

Currently, traditional electricians and IT support technicians are being relied upon to support and maintain this new infrastructure. However, as with all new technologies and industries, a new workforce of specifically trained technicians will be required if the goals of NEVI, state and federal governments, and the EV industry itself are to be met. EV charging stations are complex networks. While traditional electricians may have the know-how to address electrical interruptions or issues with the infrastructure, they may not have the EV-specific knowledge to keep the network operational. Likewise, while IT workers may have the computer hardware or networking knowledge useful for supporting the technology that EV charging stations are built around, they may not have the specific skills to support electrical equipment.

A new component in the EV Revolution is specifically trained, certified electric vehicle service equipment (EVSE) technicians. This latest addition to the workforce is poised to make a significant difference in the nation’s race to bolster its EV charging infrastructure — and thereby support greater EV use. A well-supported charging infrastructure, maintained by specially trained professionals, will lead to greater uptime, a reduction in “charging anxiety,” and greater overall adoption of EVs.

The expanded workforce

With an expanded EV charging infrastructure comes more jobs, which is another way the EV Revolution appeals to so many in the United States. From those tasked with installing the new charging systems to those trained to maintain them, studies show that between 2021 and 2023, there has been a 21% increase in employment demand for EV and EV-adjacent jobs. Thousands of professionals will be needed in the coming years if the country is to meet the growing demand for EV use. These professionals will serve as frontline supporters of the EV Revolution, keeping it moving forward and allowing for more innovative expansion of the EV industry in the near future.

It is clear that the success of the EV Revolution greatly hinges on the support of the EV charging infrastructure. The automotive industry is on the fast track toward electrification, the benefits of which are already being felt. A single EV produces 60-68% fewer greenhouse gases than a fuel-based vehicle over its lifetime, so the more EVs on the road, the better our chances of fighting climate change. As global demand for EVs continues on an upward trajectory, a strong focus on training specialized technicians to support the planned expanded charging infrastructure will be critical. A well-planned and supported EV charging infrastructure not only supports EV use but also strengthens the economy through job creation. By prioritizing the health and expansion of the EV charging network, we can improve our defenses against climate change and pave the way for a healthier and more sustainable future for everyone on the planet.

Learn more about SkillFusion’s EV Charging Infrastructure at the website here.

Related News:

2024 US OEM EV App Report Released by JD Power

MaxiCharger Megawatt Charging System Released by Autel Energy

Not All Data is Created Equal: the Value of Data Granularity
https://digitalitnews.com/not-all-data-is-created-equal-the-value-of-data-granularity/ | Mon, 20 May 2024

Evolving into the digital age, many modern organizations have developed an understandable obsession with data. On the plus side, they'll generously fund initiatives that support the collection and storage of data. On the negative side, there's little realization that this often results in collecting data for data's sake. Executives overeager to bring their companies into the 21st century by making them data-driven are part of the way there. Yes, they have realized that data is of little value without deriving insights from it. But there is so much data being collected that it becomes difficult to see the wood for the trees and truly extract the sort of analysis that leads to meaningful shifts in strategy. What is the critical element they are missing? Simply this: large sets of data can lead to misleading conclusions. The real game changer is the insights that can be pulled out of nuggets and granules of data. The catch is that you have to know where to look.

Valuing data quantity over quality equates to looking for that proverbial needle in a haystack. While it’s true that there’s a piece of valuable information somewhere in the massive dataset, how much labor and capital will it take to retrieve it? If the wrong piece of data is retrieved and developed into a plan for action, how long will it take for the organization to course correct?

Companies on the cutting edge are now beginning to take note that the adage “data is the new oil” is out of date. Instead, these companies prioritize refining their algorithms, allowing them to run lean when it comes to data. For example, Facebook no longer has to worry about encroaching on the user’s privacy, because their advanced algorithm can use simple, publicly available data to generate the insights necessary to sell ads.

The Level of Detail in Data

A data glut riddled with hidden costs can be avoided by honing in on the correct level of data granularity before the collection process takes place. Data granularity measures the detail present in data. For example, data that measures yearly transactions across all stores in a country would have low granularity, while data that measures individual stores' transactions by the second would have incredibly high granularity.
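The two levels of granularity described above can be made concrete with a small sketch over hypothetical transaction records (the data and field layout here are invented for illustration):

```python
from collections import defaultdict

# Hypothetical transaction log: (store, ISO timestamp, amount)
transactions = [
    ("store_a", "2023-01-15T09:30:01", 25.00),
    ("store_a", "2023-01-15T09:30:02", 12.50),
    ("store_b", "2023-06-20T14:05:11", 40.00),
    ("store_b", "2024-03-02T10:00:00", 18.75),
]

# Low granularity: total sales per year, across all stores.
by_year = defaultdict(float)
for store, ts, amount in transactions:
    by_year[ts[:4]] += amount

# High granularity: sales per individual store, per second.
by_store_second = defaultdict(float)
for store, ts, amount in transactions:
    by_store_second[(store, ts)] += amount

print(dict(by_year))         # one number per year
print(len(by_store_second))  # one bucket per store-second
```

The same four records collapse to two coarse yearly totals or expand to four store-second buckets; which view is "right" depends entirely on the function consuming the data.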

However, there’s a danger in organizations assuming that increased data granularity directly correlates with its value or applicability. When someone is lost, zooming farther into a map doesn’t proportionally improve their chances of finding their way home. There is an optimal level of data granularity for each function within an organization; a uniform level of granularity throughout an organization might benefit some functions but hinder others.

Consider the following two examples of organizations using the right and wrong levels of data granularity. Organization A succeeds by understanding the specific price sensitivity of each of its product and customer combinations, while organization B bleeds margin by pushing a blanket top-down price change of +5% on every product and customer combination, informed solely by the data point that costs have increased an average of 5%. Both are informed by data, but the second has such low granularity that it will inevitably deliver poor results.
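A toy numerical sketch of the contrast between the two organizations, using invented segments and a deliberately simplified sensitivity rule (not a real pricing model):

```python
# Hypothetical product-customer segments with current price and an
# estimated price sensitivity (higher = more demand lost per % price rise).
segments = {
    ("widget", "retail"):    {"price": 10.0, "sensitivity": 0.2},
    ("widget", "wholesale"): {"price": 8.0,  "sensitivity": 1.5},
    ("gadget", "retail"):    {"price": 20.0, "sensitivity": 0.5},
}

def blanket_prices(pct: float) -> dict:
    """Organization B: one top-down change applied to every combination."""
    return {k: round(v["price"] * (1 + pct), 2) for k, v in segments.items()}

def granular_prices(avg_pct: float) -> dict:
    """Organization A (sketch): pass more of the increase to insensitive
    segments and less to sensitive ones, preserving volume where it matters.
    The weighting rule below is purely illustrative."""
    out = {}
    for k, v in segments.items():
        adj = avg_pct * (1.5 - v["sensitivity"])
        out[k] = round(v["price"] * (1 + max(adj, 0)), 2)
    return out

print(blanket_prices(0.05))   # +5% everywhere, sensitivity ignored
print(granular_prices(0.05))  # per-segment adjustments
```

Under the blanket change, the highly price-sensitive wholesale segment gets the same +5% as everyone else; the granular rule leaves its price untouched, which is exactly the kind of per-combination decision the text attributes to organization A.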

Agility Matters in Uncertain Times

The pandemic, ensuing supply chain crisis, and geopolitical instability have shed light on the holes in traditional pricing models. Traditional pricing models assume that customers are highly price sensitive, that price is the deciding factor in choosing between two comparable items. However, COVID challenged these assumptions—retailers were surprised when massive discounts did little to remedy overstocks. Inflation and public health concerns governed spending patterns. Linear pricing models could not take these external factors into account. But with more granular data, pricing models can be developed which factor in location, demographics, seasonality, and countless other intangibles.

Another disadvantage of the traditional approach to pricing is that it’s inflexible and unresponsive to rapid changes in spending patterns. With today’s volatile macroeconomic conditions, agility is crucial—alternative trade routes, suppliers, and customer bases need to be established on the fly. Predictive models try to foresee global crises but are ultimately playing an unwinnable game. On the other hand, prescriptive pricing models based on granular data react so quickly that predicting the future becomes unnecessary.

Learn to Frame Complex Problems, Let AI Do the Rest

In the coming years, training and education will place more emphasis on asking the right questions rather than answering the wrong ones. While AI can automate away tedious, manual tasks, it lacks the critical thinking skills and independence necessary to frame complex problems. Data granularity goes hand in hand with this cultural shift—the collection of data will become cheap and accessible, yet granularity issues will require the skills that make humans irreplaceable.

AI isn’t automatically added value. Without talented human capital to write prompts, the lure of these tech investments can do more harm than good. For example, asking AI, “How do I sell more inventory?” is the wrong question: the machine would suggest applying massive discounts across the board. Rather, one should ask, “How can I lift my market share, sales, and margin while preserving my value perception?” because the answer will be a balanced view of the complex outcomes that typical firms optimize for. So, the approach should be to identify the most productive goal for the machine. If you forget what you actually want, the outcome can damage the business, even as the machine gets better at doing the wrong thing. The key is setting the right goals and putting rules in place that ensure nothing critical is sacrificed along the way.

For many, transitioning business processes from manual to automated and data-backed has been difficult; often the wrong KPIs are emphasized, and organizations over-fit or under-fit analytic data models. However, by focusing on the right granularity, organizations can unlock the full potential of artificial intelligence.

To learn how data granularity can help your organization, visit the website here.

Related News:

Unified Frontiers: Revolutionizing Operations Through Centralized IT/OT Integration

SQream Integrates with Dataiku for Advanced Big Data Analytics Technology

Top Ways to Enhance Your Cybersecurity Defenses
https://digitalitnews.com/top-ways-to-enhance-your-cybersecurity-defenses/ | Wed, 15 May 2024

There are over 2,200 cyberattacks daily, equating to nearly one cyberattack every 39 seconds. Understandably, this has led cybersecurity to be the top concern for enterprises worldwide. With cyberthreats constantly evolving, all organizations must adapt and strengthen their defense strategies to prevent costly breaches and attacks.

Here are the top tips for adapting your cybersecurity defenses.

Protect Access Credentials

For years now, attackers have targeted credentials. Even with the nationwide increase in attention to credential security, credential theft remains a prevalent issue: the Ponemon Institute found that 54% of security incidents were caused by credential theft. Enterprises looking to safeguard their access points should follow these three steps:

  1. Implement Phishing-Resistant Multi-factor Authentication (MFA) – According to IBM, phishing is the most expensive initial attack vector, racking up a cost of $4.9 million in 2023 alone. Phishing-resistant MFA, such as FIDO-based authentication, adds a layer of security beyond traditional MFA techniques like SMS or one-time passwords (OTPs), reducing the likelihood of successful phishing attacks.
  2. Secure Service Accounts through Privileged Access Management (PAM) – Utilizing a PAM system for managing service accounts helps in password management, rotation and obfuscation. This mitigates the risk of compromised credentials with access privileges, removing credentials as an attack path.
  3. Transition to Passwordless Authentication – Lastly, remove passwords as the primary authentication source. Moving towards a Passwordless Environment or Passwordless Experience in the environment eliminates significant vulnerabilities, reducing the likelihood of brute force attacks and credentials as an attack vector.
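As an illustration of the rotation idea behind step 2, the following sketch generates strong service-account passwords and checks credential age. It is a minimal, hypothetical example of the concept, not any particular PAM product's API:

```python
import secrets
import string
from datetime import datetime, timedelta

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def new_service_password(length: int = 32) -> str:
    """Generate a cryptographically strong random password."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

def needs_rotation(last_rotated: datetime, max_age_days: int = 30) -> bool:
    """PAM systems rotate credentials before they age out."""
    return datetime.utcnow() - last_rotated > timedelta(days=max_age_days)

# Example: a service account last rotated 45 days ago is overdue.
last_rotated = datetime.utcnow() - timedelta(days=45)
if needs_rotation(last_rotated):
    new_secret = new_service_password()
    # In a real PAM deployment the new secret would be vaulted and the
    # dependent service updated automatically -- no human ever sees it.
```

The point of the pattern is obfuscation: because the credential is machine-generated, vaulted, and rotated on a schedule, a stolen copy has a short useful life, which is what "removing credentials as an attack path" means in practice.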

Enhance Breach Detection Capabilities

The number of data breaches in the U.S. has increased significantly within the past decade. Statista reported 447 data breaches in 2012; by 2023, that number had grown to over 3,205. However, these figures cover only the breaches that have been reported; it is widely believed that the number of actual breaches far exceeds them. This, coupled with IBM’s estimated average total cost of $5.13 million per breach, shows how seriously companies should take this issue.

Quickly detecting “Indicators of Attack” is crucial in minimizing the impact of cyberattacks. Organizations should invest in people, processes and technology, as well as implement Active Directory hygiene and Identity Threat Detection to find and shut down malicious activity before it penetrates the organization’s IT infrastructure. With these investments, organizations must find the balance between securing their “crown jewels” and enabling the organization to operate efficiently.

  1. Invest in People, Processes and Technology – Organizations must allocate resources toward cybersecurity investments aimed at decreasing the time it takes to detect and respond to cyber threats. This involves a full-rounded approach of skilled personnel, streamlined processes and cutting-edge technologies.
  2. Focus on Active Directory (AD) Hygiene – Active Directory remains the core authentication platform in most organizations, making AD hygiene a critical component. CrowdStrike found that 50 percent of organizations have experienced an Active Directory attack in the last two years, and 40 percent of those attacks succeeded due to poor hygiene. By regularly reviewing and monitoring user and device identities, managing security groups and employing Privileged Access Management for AD admins, organizations stand a stronger chance of preventing these attacks and remaining secure.
  3. Implement Identity Threat Detection and Response (ITDR) – The best way to prevent any attack is to stay proactive. ITDR solutions enable proactive monitoring and control of the Active Directory environment, enforcing early detection and response to identity-based threats.
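One concrete piece of the AD hygiene described in step 2 is flagging stale privileged accounts. A minimal sketch over synthetic directory records (the field names, dates, and 90-day threshold are assumptions for illustration; a real implementation would query AD over LDAP):

```python
from datetime import datetime, timedelta

# Synthetic directory export: (sAMAccountName, last logon, is privileged)
accounts = [
    ("svc_backup", datetime(2023, 1, 10), True),
    ("jdoe",       datetime(2024, 5, 1),  False),
    ("old_admin",  datetime(2022, 8, 3),  True),
]

def hygiene_findings(accounts, now, stale_after_days=90):
    """Flag accounts with no recent logon, listing privileged ones first."""
    cutoff = now - timedelta(days=stale_after_days)
    stale = [(name, priv) for name, last, priv in accounts if last < cutoff]
    return sorted(stale, key=lambda item: not item[1])  # privileged first

now = datetime(2024, 5, 15)
for name, priv in hygiene_findings(accounts, now):
    tag = "PRIVILEGED" if priv else "standard"
    print(f"stale account: {name} ({tag})")
```

Stale privileged accounts like these are exactly the poor-hygiene footholds the CrowdStrike figures describe; regular reviews exist to disable them before an attacker finds them.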

Emphasize Attack Surface Management

Attack surface management has become increasingly popular for modern organizations, largely due to the projected 50% increase in cyberattacks across all industries by 2026. A vast majority of attack surfaces are exposed through external services and SaaS platforms: for example, organizations running both known and unknown cloud-native workloads alongside shadow IT. When traditional perimeter security measures are undefined, attack surface management is essential to an organization’s cybersecurity defenses. Prioritizing effective management of the attack surface is indispensable for a comprehensive defense-in-depth strategy, allowing potential vulnerabilities to be identified and mitigated.

Leverage Threat Intelligence

In IBM’s 2023 Cost of Data Breach Report, they concluded that globally, organizations took an average of 204 days to identify a data breach. That said, organizations using threat intelligence identified threats 28 days faster. Threat Intelligence has proven vital in helping organizations block the “known bad” and understand and contextualize threats in and outside their environment. Additionally, operationalizing high-fidelity Threat Intelligence sources will help facilitate rapid detection by decreasing false positives and enabling incident responders to focus on decreasing the time to mitigate and remediate cyber threats.
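At its simplest, "blocking the known bad" amounts to matching observed events against an indicator feed. A minimal sketch with invented indicators and synthetic log events (real feeds carry richer metadata such as confidence scores and expiry, which is part of what "high-fidelity" means):

```python
# Hypothetical threat-intelligence feed: known-bad indicators of compromise.
ioc_feed = {
    "185.220.101.7",                      # flagged IP address
    "evil-updates.example",               # flagged domain
    "44d88612fea8a8f36de82e1278abb02f",   # flagged file hash
}

# Synthetic events observed in logs.
events = [
    {"src_ip": "10.0.0.5",     "domain": "cdn.example.com"},
    {"src_ip": "185.220.101.7", "domain": "evil-updates.example"},
]

def match_iocs(event: dict, feed: set) -> list:
    """Return any event field values that appear in the intel feed."""
    return [value for value in event.values() if value in feed]

hits = [(i, match_iocs(e, ioc_feed))
        for i, e in enumerate(events) if match_iocs(e, ioc_feed)]
print(hits)
```

In this sketch, only the second event matches the feed, so a responder's attention goes straight there; curating the feed to keep false positives out is what lets incident responders focus on real threats.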

Safeguarding your business against cyber threats requires a comprehensive strategy: proactive measures to protect access credentials, enhance breach detection capabilities, manage attack surfaces and leverage threat intelligence sources. While the occurrence and cost of breaches continue to grow, not all hope is lost. Organizations can effectively mitigate their risk by continuously evolving and reinforcing their cybersecurity defense strategies.

Related News:

Verinext Named Fortinet EPSP; Engaged Preferred Services Partner

Cisco Gold Integrator Status Earned by Verinext in the Cisco Partner Program

Exploring Ethical Frontiers: AI and Its Impact
https://digitalitnews.com/exploring-ethical-frontiers-ai-and-its-impact/ | Mon, 13 May 2024

Profit is a key driver in the decision-making process for the vast majority of businesses. When a new opportunity arises, assessing its potential to increase the business’s profits is central to determining whether it will be embraced. However, profit isn’t the only factor businesses must weigh. An opportunity’s potential impact on people is also important, and that is where business ethics comes into play: ethics seeks to ensure business behavior is both profitable and positive for people.

The rise of artificial intelligence and the potential it brings to the business world has sparked several key concerns regarding business ethics. Many experts argue that technological concerns are just one side of the coin when it comes to developing and deploying AI. To be responsible, those exploring and leveraging the power of AI must also be mindful of its ethical implications.

Ethical concerns surrounding data privacy

Collecting data has become a core component of doing business in the digital age. Most businesses collect and store not only personal identifying information on customers but also data on their customers’ activity. Protecting that data from unauthorized access and misuse is an ethical responsibility that, in many cases, is also a regulatory obligation.

The adoption of AI in the business world has increased the ethical concerns surrounding data privacy. Training and developing AI requires vast amounts of data. In some cases, this has led businesses to collect more data. In others, it has led to data being repurposed to assist in AI training. Overall, the increased demand for data has resulted in an increased risk of privacy violations.

The ethical debate surrounding data privacy is focused on the steps businesses should take to collect and safeguard data. Most businesses agree that ethics demand they protect data against breaches, though whether or not businesses should utilize customer data for AI training is an emerging ethical debate.

Ethical concerns surrounding bias and fairness

AI’s potential to perpetuate biases has emerged as one of the primary ethical concerns surrounding its use. AI learns from the data upon which it is trained, so if the data contains biases, they can be perpetuated and amplified by AI-driven platforms, leading to discrimination, social feedback loops, and other damaging outcomes.

A simple example of the issues AI bias can cause is found in the training of AI for face-recognition applications. If the data used for training doesn’t include a broad representation of races and genders, the resulting AI can cause problems for underrepresented groups, as when Facebook’s AI labeled some people as “primates.” Recent research shows that people who interact with biased AI unconsciously absorb those biases, and the effect persists long after they stop using the AI in question. More serious repercussions could result if similar discriminatory biases are inherent in the training used for AI platforms that assist with hiring or financial lending.

Using AI-driven platforms to assist with healthcare diagnoses is another example of an application where biases can result in dangerous consequences. If the training protocols used to develop AI algorithms do not address biases, diagnoses can be skewed in ways that reduce the rate of accuracy for certain demographics. If the skewed findings are relied upon, the treatments prescribed to patients can result in serious harm.

An ethical approach to AI requires that biases be identified and addressed. Ideally, biases will be caught in training data and corrected before they are passed on to AI. Removing them from the machine learning models that result from training is a more difficult task than protecting against their insertion.
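Catching biases in training data can start with something as simple as a representation check. A minimal sketch over synthetic labels (the group names, counts, and 10% threshold are invented for illustration; real audits use far more nuanced fairness metrics):

```python
from collections import Counter

# Synthetic demographic labels for a training set (hypothetical).
training_groups = ["group_a"] * 900 + ["group_b"] * 80 + ["group_c"] * 20

def representation_report(groups, min_share=0.10):
    """Return each group's share of the data and whether it falls
    below the minimum share considered adequate representation."""
    counts = Counter(groups)
    total = len(groups)
    return {g: (n / total, n / total < min_share) for g, n in counts.items()}

for group, (share, underrepresented) in representation_report(training_groups).items():
    flag = " <-- underrepresented" if underrepresented else ""
    print(f"{group}: {share:.1%}{flag}")
```

Here two of the three groups fall below the threshold before any model is trained, which is exactly the stage at which the text says biases are cheapest to correct.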

Ethical concerns surrounding accountability in AI

ChatGPT, which is now just one of a growing number of AI-driven chatbots, has over 100 million weekly users who pose more than 10 million queries per day. The potential for its answers to be factually incorrect or misleading is well known. In fact, the term “AI hallucination” has been coined to describe these responses.

A key ethical question: Who is accountable for those wrong answers, especially if they result in harm or loss? If an AI-driven platform provides an incorrect diagnosis for a medical condition, who is responsible — the doctor involved, the company that developed the AI platform, or the company that trained it? The ethical approach requires someone to take responsibility for the problems that flow from the use of AI.

Providing adequate transparency in the development of AI is an ethical issue closely related to accountability. The rationale behind AI’s decision-making is often unclear, even to its developers, and this “black box problem” makes it difficult to identify the cause of biases in AI’s results or assign accountability for the issues they cause.

Ironically, AI can serve as a powerful instrument in promoting ethical practices. The detection and mitigation of biases are now being enhanced by AI-driven software, which aids in identifying and correcting skewed data. Furthermore, AI plays a pivotal role in demystifying the decision-making processes of other AI systems, offering insights into their internal logic. Ethical judgments could potentially be integrated within sophisticated Large Language Models, allowing these systems to weigh ethical considerations in their outputs.

Additionally, AI contributes to the protection of training data through methods like differential privacy, where it fine-tunes the balance of noise addition and facilitates the creation of synthetic data that maintains privacy while being analytically useful. AI may be the cause of, and solution to, many of these ethical issues.
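The noise-addition idea behind differential privacy can be sketched in a few lines: a Laplace mechanism releases a count with noise scaled to the privacy budget epsilon. This is a textbook sketch, not a production mechanism:

```python
import random

def laplace_noise(scale: float) -> float:
    """The difference of two i.i.d. exponentials is Laplace(0, scale)."""
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with Laplace noise calibrated to epsilon.

    Smaller epsilon means a larger noise scale: more privacy, less accuracy.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

random.seed(0)
print(dp_count(1000, epsilon=0.5))  # near 1000, but never exactly reported
```

Balancing that noise scale against analytic usefulness is the "fine-tuning" the paragraph above refers to; with a reasonable epsilon the released count stays close to the truth while no individual record is exposed.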

An ethical approach to AI development acts as a catalyst for innovation, ensuring that advancements are sustainable, socially responsible, and aligned with long-term regulatory visions, thereby accelerating progress.

Related News:

Without Intelligent Data Infrastructure Up to 20% of AI Initiatives Fail

Bugcrowd AI Pen Tests Introduced to Improve Confidence in AI Adoption

Conference Rooms Benefit More from Commercial Displays than Consumer Alternatives
https://digitalitnews.com/conference-rooms-benefit-more-from-commercial-displays-than-consumer-alternatives/ | Wed, 08 May 2024

Modern conference rooms have undergone a major evolution, incorporating advanced technologies to elevate productivity, collaboration, and overall user experience. These spaces are distinguished by their seamless integration of audio-visual equipment, creating an immersive environment for presentations, video conferences, and collaborative discussions.

At the heart of conference rooms today are high-resolution displays that facilitate dynamic content sharing and serve as a focal point for presentations. When purchasing these displays, businesses encounter a crucial decision: choosing between a consumer TV or a commercial display specifically engineered to withstand the demands of a contemporary business environment.

In this article, I will present the argument for utilizing commercial displays in conference rooms, emphasizing the importance of businesses evaluating their unique use cases before making a purchase decision.

User Simplicity

Most design engineers will tell you that the desired functionality of a video conference room should enable someone to walk into the room and use a table-top touch panel to see the upcoming meeting. The user then hits the “Start Meeting” button, and the display and all other peripherals, such as the camera and microphone, start up and show the Microsoft Teams Room, Zoom Room, Google Meet, or other video conferencing platform.

More simply, the desired use case is a one-button tap to start. Ending the meeting is just as simple: the user touches the panel to end the call, and the system shuts down all of the components, including the display.

CEC vs. “Wake/Sleep”

While most video bars and other video conferencing systems do send some CEC (Consumer Electronics Control) commands, a more standardized way of presenting this “one touch start” is through the commercial displays’ ability to read the HDMI Clock and Data packets coming from the video bar or room system.

For context, HDMI was initially created for the consumer market, but as adoption became more widespread, it eventually bled into the commercial market. The latest version, HDMI 2.1, for example, is able to support more of the demands of commercial AV systems.

This is important because CEC was created as a means to control HDMI connected devices using only one remote controller. Because of the HDMI standard’s consumer roots, that makes perfect sense. If a consumer purchased an LG TV and soundbar, it would be much more convenient to control both devices from a single remote.

The desired conference room use case, however, may run into issues relying on this CEC system to appropriately control the display via HDMI, as there are no well-documented standards for CEC between devices and display manufacturers. Most commercial display manufacturers have instead integrated a wake-on-signal protocol into the firmware of their displays. These protocols automatically recognize when a signal is sent to the display and wake it from sleep; the display also knows to stay dark when no signal is detected. This is accomplished through signal sensing, not CEC.

In some cases, the user must set the display’s menu to go to sleep with power on and not turn off completely. Moreover, there are a few different power management settings that should be enabled or disabled to get the desired wake/sleep functions on a commercial display.

Fortunately for LG customers using many partner video bars, these settings are baked into the LG firmware and change automatically when the video bar is plugged in via HDMI. All they have to do is connect the video bar to power, connect it to the display via HDMI, and power it up.

Commercial is Always the Winner

Wake on signal isn’t the only advantage commercial displays hold over consumer options in conference rooms. Many consumer displays come with out-of-the-box support for apps such as Amazon Prime Video, Netflix, Hulu and more. In a professional setting, these applications may prove unprofessional at the least or, at worst, break a company’s code of conduct. There is no easy way to lock out these applications, nor to control them from a central location.

Warranty coverage is another major commercial advantage. For most consumer displays, the warranty becomes null and void if the product is used in a commercial setting. So, if a display stops working after six months, there is nothing the company can do to replace it. Commercial displays typically carry two-to-three-year warranties, with LG offering a three-year, and even a five-year, option on most commercial models.

Commercial displays also have greater longevity due to their higher brightness and ability to stay on far longer than consumer displays. For example, a consumer display is typically rated for a maximum of eight hours of use, five days a week, at around 200 to 300 nits (a nit is a unit of brightness). Commercial displays for indoor spaces range from 300 to 700 nits and can run 16 to 24 hours a day, seven days a week. They are ideal for spaces with more sunlight or brighter lighting than the typical living room.
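The duty-cycle difference behind those ratings is easy to quantify. A quick back-of-envelope comparison, using the typical ratings quoted above:

```python
# Arithmetic behind the duty-cycle comparison (figures taken from the
# article's typical ratings; actual specs vary by model).

consumer_hours_per_week = 8 * 5      # rated ~8 h/day, 5 days/week
commercial_hours_per_week = 24 * 7   # rated up to 24 h/day, 7 days/week

print(consumer_hours_per_week)    # 40
print(commercial_hours_per_week)  # 168
print(commercial_hours_per_week / consumer_hours_per_week)  # 4.2
```

In other words, a commercial panel is rated for more than four times the weekly on-time of a typical consumer TV.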

So, while a consumer TV may be less expensive to purchase, it’s sure to lack the features and functionalities that address the unique demands of professional settings. As organizations navigate the landscape of technology options, commercial displays will help them craft a modern, reliable and efficient conferencing workspace.

To learn more about commercial displays for modern conference rooms, visit the LG Business Solution website here.

Related News:

LG Business Solutions Spotlighted at CES 2024

LG CreateBoard and IGEL Spearhead a Digital Signage Revolution

The post Conference Rooms Benefit More from Commercial Displays than Consumer Alternatives appeared first on Digital IT News.

Improving Pediatric Mental Health Care in the Digital Age https://digitalitnews.com/improving-pediatric-mental-health-care-in-the-digital-age/ Tue, 30 Apr 2024 15:59:49 +0000

The post Improving Pediatric Mental Health Care in the Digital Age appeared first on Digital IT News.

Children are now born into a digital world, so it’s natural for there to be concerns about the detrimental effects of social media on a child’s mental health. Studies have shown a correlation between increased social media use and symptoms of depression, anxiety, cyberbullying, sleep disturbance, and other behavioral problems among young adults. Given these alarming findings, the role of the pediatrician has never been more critical in identifying and addressing these issues and safeguarding the well-being of young patients.

The Complexities of Screening for Mental Health Issues

Screening young people for mental health needs is a delicate process that requires a nuanced approach. Pediatricians must navigate the delicate balance between probing for personal, social, and clinical information and creating a safe, non-judgmental environment where children feel comfortable sharing their concerns candidly. For example, many children are reluctant to disclose the extent of their social media activity, their feelings, their interactions, or episodes of bullying, especially if they fear that their parents or caregivers may react negatively to candid answers.

A new wave of screening and triage tools can help providers identify children who may need additional support or confidential services. These tools can be integrated into the pre-arrival check-in process or telehealth platforms, allowing patients to disclose sensitive information privately before their appointment. Some studies have shown that patients are more forthcoming on a remote pre-arrival questionnaire than when they are speaking one-on-one with their doctors or in front of their guardians.

Challenges in Obtaining Honest Answers

One of the significant challenges pediatricians grapple with is obtaining honest and accurate information from their young patients. Their reluctance stems not only from not wanting to share personal information with an adult, but also from a fear that their parents might gain access to that information. Sensitive topics can include depression, sexual activity, substance abuse, social media addiction, and domestic violence. This makes it essential for pediatricians to build trust and rapport with children and their families.

The Role of Multiple Stakeholders in Children’s Health

When addressing the impact of social media and mental health on children, it’s crucial to recognize that pediatricians are just one of many vital stakeholders. Schools, religious institutions, sports teams, special needs programs and other community organizations all play a significant role in shaping a child’s well-being and social media usage. As the saying goes, “It takes a village,” but oftentimes the disparate objectives and goals of these stakeholders can cause friction in a child’s care, such as the coach who believes a doctor is overly cautious in suggesting when a child can safely return to the sport.

Ways Technology Can Help

  • Direct, open communication: Digital tools allow minors to communicate directly and privately with their providers, removing the barrier that a guardian might present.
  • Patient Privacy and Confidentiality: Healthcare organizations need to ensure that their systems and processes comply with privacy regulations like HIPAA. This includes implementing robust security measures to protect sensitive patient data, as well as providing clear privacy policies and consent forms for patients and their parents about how information may be shared with others.
  • Secure Communication Channels: Offering telehealth services requires secure and encrypted communication channels to protect the confidentiality of patient-provider interactions. This could involve implementing video conferencing platforms with end-to-end encryption or secure messaging systems for text-based communication.
  • Separation of Patient Records: Electronic Health Record (EHR) systems should have the capability to separate and restrict access to certain parts of a patient’s record based on sensitivity and age. This would allow young patients to have private conversations with healthcare providers without their parents having automatic access to that information.
  • Age-appropriate Interfaces: Pre-arrival check-in and patient portals should be designed with age-appropriate interfaces and language to ensure that young patients can understand and navigate the system comfortably. This could involve simplified interfaces, visual cues, and clear explanations of confidentiality policies.
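To illustrate the "Separation of Patient Records" point above, here is a minimal sketch of record-level confidentiality flags. The field names and role model are illustrative only, not the schema of any particular EHR system:

```python
# Minimal sketch of record-level confidentiality in an EHR-like store.
# A simple role model: guardians see only non-confidential entries,
# while providers see everything. Field names are hypothetical.

RECORDS = [
    {"note": "Annual physical summary", "confidential": False},
    {"note": "Behavioral health screening", "confidential": True},
]

def visible_records(records, viewer_role):
    """Return the subset of records the viewer is allowed to see."""
    if viewer_role == "provider":
        return records
    # Guardians (and any other role) see only non-confidential entries.
    return [r for r in records if not r["confidential"]]

print(len(visible_records(RECORDS, "guardian")))  # 1
print(len(visible_records(RECORDS, "provider")))  # 2
```

A real system would layer this behind authentication, audit logging, and state-specific minor-consent rules; the sketch only shows the core filtering idea.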

Conclusion

Addressing the complex interplay between social media usage, mental health, and overall well-being in minors requires a collaborative and multi-faceted approach. While pediatricians play a crucial role in screening for mental health issues and providing medical guidance, it’s essential to recognize the influence of all the other stakeholders in a child’s life, including schools, religious institutions, sports teams, and community organizations.

By working together and fostering open appropriate communication among all stakeholders, we can create a supportive and nurturing environment that empowers children to make informed decisions about their social media usage, prioritize their mental health, and thrive in today’s digital world.

In addition, by leveraging cutting-edge technologies, pediatricians have the digital tools they need to navigate the complexities of modern healthcare and address the unique challenges posed by social media. To learn more about how these technologies can improve pediatric care, visit the Yosi Health website here.

Related News:

Eagle MedWorks Connect Modular Telemedicine Cart System Announced

IGEL Collaborates with Insentra and Insight to Assist Western Health


Harvest Now, Decrypt Later: Data Stolen Today Is at Risk in the Future https://digitalitnews.com/harvest-now-decrypt-later-data-stolen-today-is-at-risk-in-the-future/ Mon, 29 Apr 2024 15:00:36 +0000

The post Harvest Now, Decrypt Later: Data Stolen Today Is at Risk in the Future appeared first on Digital IT News.

Quantum computing is a rapidly developing technology, with world-leading economies like the US, China and Western Europe competing to advance it. While quantum does not replace traditional computing, there are specific types of calculations that it can complete much, much faster.

One such mathematical problem happens to be at the core of all current encryption standards. These standards have been a cornerstone of IT security worldwide for decades because, without the decryption key, decrypting data takes so long that the process is rendered pointless. However, quantum computers will not have this limitation, which will make current encryption standards useless.

Although quantum technology is not likely to reach this milestone for 5–10 years, malicious actors are already harvesting encrypted data from both public and private organizations, in anticipation of being able to decrypt and leverage it later. This article identifies the organizations most at risk and provides recommendations on how to mitigate the threat.

Who is at risk?

Harvest now, decrypt later (HNDL) attacks focus on data that will retain its value until quantum-powered decryption becomes available to unlock it, such as sensitive business information, research data and intellectual property. HNDL attacks do not target transactional data or payment card information, which lose value relatively quickly due to expiration or obsolescence.

Consequently, top targets for HNDL attacks include government bodies, especially those associated with the military. For example, back in 2015, the US Office of Personnel Management suffered a breach of approximately 21.5 million records. Some of this data is so sensitive that its future decryption could impact lives and national security even decades after adversaries obtained it.

Hospitals and other healthcare organizations are also at high risk for HNDL attacks. Medical records already command higher prices on the dark web than, for example, credit cards or other PII. Personal healthcare information such as medical conditions, histories, or genetic information is of enduring value. Breaches of health data often directly affect the data subjects, and that leverage can be used to extort the victim organization or serve as the foundation for a wider attack.

Commercial organizations with long research and development cycles, such as those in the manufacturing and pharmaceutical sectors, are also in danger of having their data harvested. The nature of their business means that research can last over a decade and therefore stolen data is likely to be valuable for years.

Five Steps to Mitigate the Risk of HNDL Data Breaches

1. Identify the types of data being stored.

Technical and business teams should work together to assess the types of data that the organization possesses, along with the value and shelf life of each data type. This initial business risk assessment will drive the technical mitigation strategy. Executive buy-in is essential because project urgency, depth and costs will vary greatly depending on the results of the assessment.

2. Discover the data.

Once the organization knows which data is useful to adversaries, it needs to concentrate its security efforts on what really matters. Data discovery and classification will provide a clear understanding of the scope of the project, and visibility into data access rights will offer insight into data exposure.

3. Mitigate data risks.

Next, the organization should ensure that all data likely to be targeted in HNDL attacks is difficult to access by implementing additional security controls around it. Start with network security basics like VPN-only access to the most critical data and network segmentation. Then, rigorously enforce the least privilege principle by eliminating unnecessary permissions. Consider implementing just-in-time (JiT) access so that access privileges exist for only as long as needed.
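The just-in-time access idea can be sketched in a few lines: every grant carries an expiry, and every access check re-validates it. The names and the in-memory store are illustrative, not a specific product’s API:

```python
# Sketch of just-in-time (JiT) access: privileges exist only for a
# bounded window and every check re-validates the expiry.

from datetime import datetime, timedelta

grants = {}  # (user, resource) -> expiry timestamp

def grant(user, resource, minutes, now):
    """Issue a time-boxed access grant."""
    grants[(user, resource)] = now + timedelta(minutes=minutes)

def has_access(user, resource, now):
    """Access is valid only if a grant exists and has not expired."""
    expiry = grants.get((user, resource))
    return expiry is not None and now < expiry

t0 = datetime(2024, 4, 29, 9, 0)
grant("analyst", "research-db", 30, t0)
print(has_access("analyst", "research-db", t0 + timedelta(minutes=10)))  # True
print(has_access("analyst", "research-db", t0 + timedelta(minutes=45)))  # False
```

In production, the grant store would be the identity provider or PAM tool, and expiry enforcement would happen at the policy decision point rather than in application code.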

4. Stay alert.

HNDL attacks are more likely to go undetected than other types of attacks. For example, ransomware infections become obvious as soon as the cybercriminals freeze business operations and demand a ransom. But HNDL attackers work hard to stay unnoticed so they can continue to silently harvest data for as long as possible.

To spot HNDL attacks, organizations should establish ongoing monitoring and threat detection. They should also consider implementing threat hunting, either in-house or through a third-party vendor. Security analysts will regularly examine logs for suspicious activity that could indicate adversaries lurking in the environment or signs of data exfiltration, enabling further investigation. Understanding the motivation of HNDL threat actors and which data is most attractive to them results in more tailored threat hunting.
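Threat-hunting logic for exfiltration often starts with simple volume heuristics. The toy example below flags hosts whose outbound transfer volume far exceeds a baseline; the thresholds and log shape are invented for illustration, not taken from any product:

```python
# Toy exfiltration heuristic: flag hosts whose total outbound volume
# over a window far exceeds a baseline. Thresholds are illustrative.

from collections import defaultdict

def flag_exfiltration(transfers, baseline_mb=100, factor=10):
    """transfers: iterable of (host, megabytes) log entries.

    Returns the sorted list of hosts exceeding baseline_mb * factor.
    """
    totals = defaultdict(float)
    for host, mb in transfers:
        totals[host] += mb
    return sorted(h for h, mb in totals.items() if mb > baseline_mb * factor)

log = [("web-01", 80), ("db-02", 600), ("db-02", 700), ("web-01", 50)]
print(flag_exfiltration(log))  # ['db-02']  (1300 MB > 1000 MB threshold)
```

Real detections would use per-host baselines learned from history and correlate volume with destination reputation, but the core idea, comparing observed outbound volume against expected, is the same.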

5. Stay informed.

Quantum computing is a very expensive technology, so it is likely to appear not in someone’s basement but rather as a dedicated state-level project. Nevertheless, the threat is real and efforts to combat it have been underway for several years. In 2022, the US National Institute of Standards and Technology (NIST) announced the first four quantum-resistant cryptographic algorithms. In 2023, US President Biden issued a declaration that threats resulting from advancements in quantum computing constitute a national emergency.

Organizations whose assessments reveal that they are at high risk from HNDL attacks are more likely to participate in the NIST workgroups and be early adopters of new quantum-resistant encryption algorithms. The rest of us need to stay aware and learn from these early implementations.

The power of action

If organizations follow encryption best practices, the sensitive data harvested in HNDL attacks will not be immediately useful to cybercriminals. However, the rapid advancement of quantum computing technology makes it likely that they will be able to decrypt the stolen data in the near future. At that point, the victim organization could suffer serious consequences, from damaging its reputation to jeopardizing its very existence. Accordingly, it is crucial that all public and private businesses that hold evergreen sensitive data acknowledge the risks associated with data harvesting and take steps to prevent data breaches.

Related News:

Netwrix Solutions Expanded Its Global Partnership Network by 36%

Olympic Games Traveling Tips from Netwrix to Avoid Being Scammed


Unified Frontiers: Revolutionizing Operations Through Centralized IT/OT Integration https://digitalitnews.com/unified-frontiers-revolutionizing-operations-through-centralized-it-ot-integration/ Wed, 03 Apr 2024 14:00:26 +0000

The post Unified Frontiers: Revolutionizing Operations Through Centralized IT/OT Integration appeared first on Digital IT News.

In modern manufacturing and production, there is a never-ending drive for efficiency and quality. Companies are continually working to maintain competitiveness and stay ahead of technological advancements with their production facilities while also juggling the challenges of a global supply chain and weathering economic uncertainties.

The industry has embraced digitalization as a means to navigate these complexities, however, a critical question has emerged: is mere digitization sufficient for true modernization? This article will show that in order to realize the full potential of digital transformation, large distributed organizations need to adopt centralized IT/OT management.

Navigating the Modern Manufacturing Maze

In recent years, the manufacturing sector has witnessed an unprecedented convergence of challenges and opportunities. Economic fluctuations, technological breakthroughs, and shifting consumer demands have propelled the industry into a new era of complexity. To thrive in this landscape, manufacturers must not only adapt but also innovate, leveraging digital technologies to optimize operations and enhance competitiveness.

The Imperative of Centralized Management: A Paradigm Shift

Centralized management is at the heart of any effective modernization. Digital tools alone are not enough; true transformation means consolidating operational functions under a unified platform. Companies are turning away from outdated, fragmented approaches, symbolized by hundreds of software offerings that bring with them thousands of user account licenses. Centralization offers the opportunity for a holistic framework that seamlessly integrates data services and operational technologies. What is needed is a centralized IT/OT decision management system that can merge, manage, and monitor a broad array of tools.

Unlocking the Potential: The Economic Benefits of Centralization

The advantages of centralized integrations of this kind are manifold, spanning operational savings, production efficiencies, and quality improvements.

Through centralized monitoring and control, organizations can realize substantial cost savings, enhance production efficiencies, and elevate product quality. More efficient monitoring of operations management tools also facilitates better IT support and ensures continuous uptime, further bolstering operational resilience and minimizing disruptions.

Centralized monitoring and control functions also enable organizations to minimize resource wastage, optimize energy consumption, and streamline operational processes. For example, research indicates that the implementation of SCADA systems can yield cost savings of up to 30 percent, underscoring the economic benefits of this kind of centralized management.

By providing real-time visibility and control over manufacturing processes, centralized monitoring enhances production efficiencies and throughput. Studies suggest that centralized monitoring can lead to throughput improvements of up to 20 percent, highlighting the transformative impact of centralization on operational performance.

Centralized monitoring facilitates proactive quality management and process optimization, resulting in improvements in product quality and consistency. Through real-time quality control measures, organizations can reduce defect rates and enhance customer satisfaction ratings, thereby reinforcing their competitive edge.

Research studies and real-world case studies across various industries corroborate the efficacy of centralized management. The options today are somewhat limited to standardizing on industry leaders such as Siemens, Honeywell or Microsoft. However, there are alternative options that better enable the integration of all these players as well as emerging application providers. Solutions like Userful’s Infinity Platform remain neutral and integration focused, allowing all best of breed applications to be centrally monitored for maximum operational performance.

Regardless of whether companies deploy an industry leader or an open decision support system, the goal remains the same. Centralize IT/OT operations on a given platform. From achieving cost savings to improving efficiency and service quality, centralized integration has become a cornerstone of operational excellence.

From control rooms to operational dashboards, digital signage, and collaboration spaces, an integrated platform for IT/OT decision management brings efficiency, compliance, and innovation across the manufacturing landscape.

The journey towards modernizing manufacturing operations hinges on identifying an integration and harmonization platform for OT and IT operations. Organizations can then unlock new frontiers of efficiency, agility, and productivity.

To learn more about Userful’s Infinity Platform and how it can help with centralized IT/OT operations, visit the website here.

Related News:

Userful’s Data Dashboards Revolutionizes Mission-Critical Healthcare Operations

Enhancements Unveiled for the Userful Infinity Platform


The Next Evolution of Software Testing: Digital Quality as a Service https://digitalitnews.com/the-next-evolution-of-software-testing-digital-quality-as-a-service/ Fri, 29 Mar 2024 15:00:12 +0000

The post The Next Evolution of Software Testing: Digital Quality as a Service appeared first on Digital IT News.

The most successful brands in the world pride themselves on the high level of quality they hold themselves to. Quality is an essential component of any brand’s identity, customer loyalty, and continued success. In today’s increasingly digital world, digital product quality is just as important as that of a physical product.

That said, many organizations struggle with achieving a reliable and consistent level of quality for their digital applications and products. According to recent research, poor software quality cost organizations $2.41 trillion in the U.S. alone in 2022. There are a few reasons behind this digital quality pain point facing brands today, including a lack of both expertise and a consistently executed digital quality strategy. Additionally, because modern digital applications and platforms are often quite complex with multiple interfaces and technologies, it can be a challenge to manage and prioritize quality.

A dedicated focus on digital quality is necessary for companies to achieve the level of quality they have their sights set on for digital products. Quality needs to be built into and prioritized within the software development life cycle (SDLC), and expertise across all quality disciplines (functional, usability, localization, user experience, etc.) must be part of this approach. Unfortunately, the resources and talent required to build and add digital expertise in house can prove to be challenging for many companies. However, this growing demand for digital quality expertise has led to the rise of a new service model, Digital Quality as a Service (DQaaS).

Defining Digital Quality as a Service

Digital Quality as a Service (DQaaS) refers to a managed-service engagement in which a provider commits to achieving a pre-agreed level of quality outcomes for a client. The methodology for achieving that outcome is usually left up to the provider and is likely to involve a mix of elements including exploratory testing, accessibility testing, usability testing, test case execution, and automation.

A provider will write and execute test cases, deliver feedback, and ensure alignment with the client’s product roadmap. Aligning and collaborating in this way should provide a frictionless journey in which the provider and client are on the same page about the level of test coverage and where in the SDLC testing should take place.

Organizations leveraging DQaaS gain access to shared service teams that can support their own software development teams. The shared services team offers consultation, expertise on quality, and provides resources to specific product teams depending on priorities and goals.

Digital Quality as a Service Benefits

The benefits an organization receives from adopting a DQaaS approach to their product development and deployment include:

Expertise: With a shared service team model, an organization isn’t just getting a group of testers to find bugs. These are experts who can assess and prioritize bugs, manage a team of testers and deeply understand different types of testing. They also know how to implement testing approaches and improve processes by adding elements like automation and customer journey testing.

Holistic Approach: Conceptually, DQaaS approaches quality holistically. A shared services team provides experience and expertise across functional, usability, performance, payments, accessibility testing and more. The purpose of the team is to deliver end-to-end quality for organizations, ensuring features and entire products are built up to a certain quality standard and encompassing the entire customer journey.

Broader Quality Perspective: Without being tied to a specific department, a shared services team can see the bigger picture after working with individuals and different teams. Learnings and best practices from one team can then be applied to others, leading to a more aligned, collaborative and educated overall team.

Plug-In Service: Organizations often know they can be doing more when it comes to quality for their software, but they don’t have the resources to hire someone full time to fill in the gaps, or have the expertise to know exactly what needs fixing. A shared service team plugs right into current workflows, providing that expertise and those resources.

Bespoke Strategy: The concept of digital quality is not the same for everyone. While there are frameworks and best practices, a variety of factors, including available resources, Agile teams, features and number of products, all weigh into the overall strategy that goes into quality. A shared services team can advise clients on what the approach to quality should look like for their specific organization.

Crowdtesting and Digital Quality as a Service

Crowdtesting digital applications and products can go hand-in-hand with the implementation of DQaaS. Crowdtesting provides real-world testing with real devices. When this is part of a DQaaS agreement, dedicated resources find and manage bugs across different software types, devices, and locations, while also consulting with QA on next steps and helping to drive testing maturity through the creation of automated regression suites.

Companies leveraging this approach benefit from having real user feedback and perspective during product development, while simultaneously having the ability to improve QA processes to meet the quality requirements they are looking for before a product is pushed live.

Building a Culture of Quality

Consistent product quality goes a long way toward defining a brand and building customer loyalty. Investing in the resources and expertise needed to make sure the highest level of quality is achieved for a product helps to build an overall culture of organizational quality. From individual developers, to QA teams, and business leaders, prioritizing quality and making it part of an organization’s culture is a worthwhile endeavor with the results speaking for themselves.

To learn more about how Applause uses Digital Quality as a Service, visit the website here.

Related News:

Applause 2024 Generative AI Survey Released

18 Artificial Intelligence Predictions for 2024


Status Update: Change Healthcare Cyber Attack https://digitalitnews.com/status-update-change-healthcare-cyber-attack/ Thu, 21 Mar 2024 17:15:06 +0000

The post Status Update: Change Healthcare Cyber Attack appeared first on Digital IT News.

The situation following the Change Healthcare Cyber Attack continues to cost the United States healthcare system millions of dollars, as well as affecting the lives of patients nationwide. Millions still have difficulty receiving their prescriptions and connecting with insurance for medical services. After weeks of chaos, the United States government has urged healthcare payers to promptly resolve the digital challenges that providers and pharmacies are encountering. Here is all you need to know about the cyber attack to prepare for UnitedHealth’s full return.

Who is Change Healthcare and What Happened?

Change Healthcare, owned by UnitedHealth Group (UHG), is the United States’ largest processor of medical claims and payment cycle management. In short, it connects payers, providers, and patients across the U.S. healthcare system, touching one in every three patient records. The company processes 15 billion healthcare transactions annually, making it a clear target for outside threats.

On February 21, Change Healthcare discovered that an unauthorized party had gained access to several of its IT systems. According to its public filing with the Securities and Exchange Commission, the company immediately took action, isolating the impacted systems.

That said, major damage was already done. Hackers had accessed patient data including social security numbers and encrypted company files. The group demanded a hefty ransom to decrypt these sensitive files and threatened to release the data if payment was not received. Since then, Change Healthcare has been offline, causing payment disruptions for tens of thousands of hospitals, physician groups, and other organizations.

The Fallout

Initial reports focused on pharmacies’ inability to fill medications, but three weeks later the public saw the full severity of the issue. The attack has impacted payments to hospitals, physicians, pharmacists, and other healthcare providers across the country, leaving providers concerned about their ability to care for patients amid cash flow and coverage uncertainty. It has not stopped them, however: hospital systems have found workarounds, in effect stepping back to the stone age of paper documentation. While this has allowed essential patient care to continue, it is likely that a significant amount of money won’t be paid out due to misplaced forms and missing formal authorizations.

Compass Point analyst Max Reale estimated the impact: “Assuming that between 5% and 10% of U.S. health care claims are affected by the attack, providers are losing between $500 million and $1 billion in daily revenue. Cash-constrained operators will begin to feel the full brunt of the slowdown in payments for services between late March and early April, assuming it takes about 30 to 45 days to process a claim and receive payment.”

Update: The Response

On March 1, Optum, the UnitedHealth Group division that operates Change Healthcare, stepped in to help, establishing a temporary funding assistance program for providers’ short-term cash flow needs.

The notice read, “We understand the urgency of resuming payment operations and continuing the flow of payments through the healthcare ecosystem. While we are working to resume standard payment operations, we recognize that some providers who receive payments from payers that were processed by Change Healthcare may need more immediate access to funding.”

Three weeks post-attack, The U.S. Department of Health and Human Services stepped in. They stated, “In a situation such as this, the government and private sector must work together to help providers make payroll and deliver timely care to the American people.”

Further government action ensued: the White House moved to remove obstacles for healthcare providers and address cybersecurity issues, planning to distribute emergency funds to providers and suppliers facing cash flow problems. In its statement, it called on UnitedHealth and private sector leaders to do the same.

In addition, the Centers for Medicare & Medicaid Services (CMS) has taken steps to reduce disruptions by expediting payments to Medicare providers and suppliers. Specifically, CMS has streamlined the process for providers to change clearinghouses so that payments and insurance plans continue, while preparing the necessary parties for paper claims and submissions.

These efforts are aimed at supporting all providers, but especially smaller systems that face existential concerns such as making payroll and supporting their most vulnerable patients.

As for the six terabytes of stolen data, the hackers held it hostage for a staggering $22 million. Due to the sensitive nature of the data, the White House urged UnitedHealth Group to quickly resolve the hackers’ demands. While only time will reveal the true cost of this breach, it is clear it will alter the way U.S. medical organizations manage their cyber resilience.

Update: The Hackers

Many have reported their suspicions about the hackers’ identity. UnitedHealth suspects the attack was nation-state-associated. The media supports this claim, pointing a finger at ALPHV, also known as BlackCat. This well-known ransomware group has operated under several names over the years, claiming responsibility for major attacks worldwide on universities, government agencies, and companies in the energy, technology, manufacturing, and transportation sectors. A notable earlier attack linked to the group’s lineage was the 2021 Colonial Pipeline shutdown. Its hack-and-rebrand practice has made it a target of law enforcement agencies worldwide.

Since payment was posted, BlackCat has shut down all of its servers and ransomware sites. In fact, on March 4, when payment was processed, the group uploaded a fake law enforcement seizure banner.

Security researcher Fabian Wosar commented, “BlackCat did not get seized. They are ‘exit scamming’ their affiliates.” And exit scamming they were.

Actors presumed to be behind BlackCat claimed their affiliates cheated them and, in response, announced plans to sell the ransomware’s source code for $5 million.

Update: On the Lookout

There is no real way to know whether any of the stolen data was leaked or whether the ransomware’s source code will be used again. This makes it vital for all organizations to strengthen their cyber resilience and remain on the lookout for an ALPHV/BlackCat rebranded comeback.

Since the hack, the company has been working diligently to safely bring its systems back online. On March 7, it restored 99% of Change Healthcare pharmacy network services, and on March 15, Change Healthcare’s electronic payments platform began payer implementations. The company scheduled further network testing and software checks starting March 18.

Protecting Your Organization

This hack reminds all of us how volatile our systems can be, and how important it is to remain proactive with security. Digital IT News received commentary from Netwrix’s VP of Security Research, Dirk Schrader regarding the best way to protect your organization from threat actors such as BlackCat.

“High dependency of our day-to-day living on proper functioning supply chains is our reality. High-profile attacks affect hundreds of thousands of individuals. Colonial Pipeline or MoveIT stories, attacks on IT service providers like Kaseya and Materna, to name a few, might vary in scale and vertical, but all of them prove the need for a coordinated approach to increase the cyber resiliency of vital services like healthcare, energy, water, transportation, etc. The domino effect of an infiltration of the supply chain can be devastating. Cyber resilience is defined as the ability to deliver the intended outcome despite adverse cyber events, and critical infrastructure is not limited to internal security incidents.”

He later outlined precautions: “Organizations that are part of a critical infrastructure should pay special attention to ensuring they might effectively operate under the ongoing attack and regularly assess the risks associated with their supply chain.” He recommended implementing, or reexamining, a response plan that covers third-party dependencies and scenarios such as these.
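One concrete building block of that kind of resilience is file-integrity monitoring, which can surface the telltale signature of ransomware: a sudden wave of modified or vanished files. The sketch below is a minimal, illustrative example only, not Netwrix's product or any vendor's tool; all function names are our own, and a production deployment would add scheduling, tamper-proof baseline storage, and alerting.

```python
import hashlib
import os


def hash_file(path, chunk_size=65536):
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()


def build_baseline(root):
    """Record a content hash for every file under `root`."""
    baseline = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            baseline[os.path.relpath(path, root)] = hash_file(path)
    return baseline


def detect_changes(root, baseline):
    """Compare the current tree against a saved baseline.

    Returns (modified, missing, added) lists of relative paths.
    A burst of modified or missing files across many directories
    is a classic indicator of ransomware encryption in progress.
    """
    current = build_baseline(root)
    modified = sorted(p for p in baseline if p in current and current[p] != baseline[p])
    missing = sorted(p for p in baseline if p not in current)
    added = sorted(p for p in current if p not in baseline)
    return modified, missing, added
```

In practice the baseline would be regenerated on a schedule and kept on separate, write-protected storage, so that an intruder who encrypts the monitored tree cannot also rewrite the record used to detect it.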

This hack has reminded the world how imperative strong cybersecurity truly is. Looking forward, we suspect this breach will permanently alter how healthcare claims and data are processed and secured.

The post Status Update: Change Healthcare Cyber Attack appeared first on Digital IT News.
