Big Data – Digital IT News
IT news, trends and viewpoints for a digital world

MemVerge and Hazelcast to Co-Develop Big Memory Solutions for Financial Services
Fri, 25 Jun 2021 | https://digitalitnews.com/memverge-and-hazelcast-to-co-develop-big-memory-solutions-for-financial-services/

 MemVerge™, the pioneers of Big Memory software, and Hazelcast, the fast cloud application platform, announced a development and marketing partnership. The Hazelcast platform, featuring real-time streaming and memory-first database technologies, is being integrated with MemVerge memory virtualization and in-memory snapshot technologies to unlock Big Memory capacity and fault-tolerance.

The technologies and software will be shaped into Big Memory reference architectures for Financial Services, starting with Basel III compliance solutions to be introduced in the second half of 2021.

“Financial institutions are reporting that analytics jobs needed for Basel III compliance are growing in size and complexity,” said Kelly Herrell, CEO of Hazelcast. “The reference architectures we develop with MemVerge will enable our customers to transform processes, accelerate the completion of jobs and increase the reliability of applications to capture value at every moment.”

For IT organizations, channel partners, and vendors that need quick access to a Basel III compliance solution, Big Memory labs hosted by Arrow, Intel, MemVerge, Penguin Computing, StorageReview.com, and WWT will be available for demonstrations, proof-of-concept testing, and software integration. Visit the Big Memory Lab page at memverge.com for information about each lab's capabilities and for scheduling time.

According to Charles Fan, CEO of MemVerge, “The Hazelcast platform excels at large-scale computational jobs such as executing and scaling financial risk calculations. MemVerge Memory Machine™ software complements Hazelcast with the ability to scale-up memory capacity to terabytes per server with in-memory snapshots for fast crash recovery of an entire cluster. The combined power of Hazelcast and MemVerge will deliver superior performance and reliability to our joint customers in financial services.”
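
MemVerge has not published the programming interface behind these snapshots, but the general pattern of snapshot-based crash recovery is easy to illustrate. The minimal Python sketch below shows the idea under stated assumptions: state lives in memory, is periodically checkpointed with an atomic write, and is reloaded on restart. The class, file name, and risk figure are hypothetical.

    # Conceptual sketch of snapshot-based crash recovery, not the
    # MemVerge Memory Machine API. The pattern: periodically persist a
    # point-in-time copy of in-memory state so a restarted process can
    # resume from the last snapshot instead of recomputing everything.
    import os
    import pickle
    import tempfile

    class SnapshottingStore:
        def __init__(self, snapshot_path):
            self.snapshot_path = snapshot_path
            self.data = {}

        def snapshot(self):
            # Write atomically: dump to a temp file, then rename over
            # the old snapshot so a crash mid-write never corrupts it.
            fd, tmp = tempfile.mkstemp(dir=os.path.dirname(self.snapshot_path))
            with os.fdopen(fd, "wb") as f:
                pickle.dump(self.data, f)
            os.replace(tmp, self.snapshot_path)

        def restore(self):
            # Fast restart path: reload the last consistent snapshot.
            if os.path.exists(self.snapshot_path):
                with open(self.snapshot_path, "rb") as f:
                    self.data = pickle.load(f)

    store = SnapshottingStore("/tmp/risk_calc.snap")
    store.data["portfolio_var"] = 1.7e6   # hypothetical risk result
    store.snapshot()                      # checkpoint after a job step

    recovered = SnapshottingStore("/tmp/risk_calc.snap")
    recovered.restore()                   # resume without recomputing
    print(recovered.data["portfolio_var"])

The press release describes the same pattern applied at the memory-virtualization layer, where snapshots capture the in-memory state of an entire cluster rather than a single process.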

New Survey Shows Enterprises Adopting Kubernetes for Big Data Cloud Migration, Cost Reduction
Wed, 05 May 2021 | https://digitalitnews.com/new-survey-shows-enterprises-adopting-kubernetes-for-big-data-cloud-migration-cost-reduction/

Pepperdata, the leader in big data performance management, announced the findings from a survey of enterprise IT professionals to understand how companies are meeting their big data application needs through the use of Kubernetes. The survey was conducted in March 2021 among 800 participants from a range of industries, 72 percent of whom worked at companies with between 500 and 5,000 employees.

Kubernetes is rapidly becoming the standard for cloud and on-premises clusters, but it is a complex technology, and companies are struggling to implement and manage these environments effectively. The complexity of big data applications makes resource optimization a real challenge. Unsurprisingly, when IT lacks granular visibility into how big data workloads perform on Kubernetes, optimized performance and spend are hard to achieve.
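
As one illustration of what that granular visibility involves, the sketch below pulls live pod-level CPU and memory usage from the Kubernetes metrics API using the official Python client. It assumes metrics-server is running in the cluster; it is a starting point for spotting over- or under-provisioned big data pods, not a monitoring product.

    # Read pod-level resource usage from the Kubernetes metrics API
    # (requires metrics-server). Granular usage data like this is the
    # raw material for optimizing both performance and spend.
    from kubernetes import client, config

    config.load_kube_config()        # or config.load_incluster_config()
    api = client.CustomObjectsApi()

    metrics = api.list_cluster_custom_object(
        group="metrics.k8s.io", version="v1beta1", plural="pods"
    )
    for pod in metrics["items"]:
        ns, name = pod["metadata"]["namespace"], pod["metadata"]["name"]
        for c in pod["containers"]:
            usage = c["usage"]       # e.g. {'cpu': '250m', 'memory': '512Mi'}
            print(f"{ns}/{name}/{c['name']}: "
                  f"cpu={usage['cpu']} mem={usage['memory']}")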

“Kubernetes is increasingly being adopted by our customers for big data applications. As a result, we see customers experiencing performance challenges,” said Ash Munshi, CEO, Pepperdata. “This survey clearly indicates that these problems are universal and there is a need to better optimize these big data workloads.”

The survey data reveals a number of insights into how businesses are adopting Kubernetes for big data applications:

  • When asked about their goals for adopting Kubernetes for big data workloads, 30 percent of respondents said to "improve resource utilization for reduced cloud costs"; 23 percent said to enable their migration to the cloud; 18 percent said to shorten deployment cycles; 15 percent said to make their platforms and applications cloud-agnostic; and 14 percent said to containerize monolithic apps.
  • Porting hundreds or thousands of apps over to Kubernetes can be challenging. The biggest hurdles for respondents were initial deployment, followed by migration, monitoring and alerting, complexity and increased cost, and reliability, in that order.
  • The kinds of applications and workloads respondents are running, in order of most to least, include Spark, 30 percent; Kafka, 25 percent; Presto, 23 percent; AI/deep learning workloads using PyTorch or TensorFlow, 18 percent; and other workloads, five percent.
  • Surprisingly, despite how much is written about the move to public cloud, the survey found that 47 percent of respondents are using Kubernetes in private cloud environments. On-premises use made up 35 percent, and just 18 percent of respondents said they were using Kubernetes containers in public cloud environments.
  • 45 percent of Kubernetes workloads are in development and testing environments as users work toward moving production workloads into the new resource management framework, and 30 percent are doing proof-of-concept work.
  • 66 percent of respondents said 75–100 percent of their big data workloads will be on Kubernetes by the end of 2021.
  • IT operations was the clear leader, at 80 percent, in deploying Spark and other big data apps built on Kubernetes; engineering followed with 11 percent, and business unit developers accounted for just nine percent.

View the full Pepperdata Big Data and Kubernetes report

ThinkIQ Announces VisualOps Solutions to Suite of Products
Tue, 04 May 2021 | https://digitalitnews.com/thinkiq-announces-visualops-solutions-to-suite-of-products/

ThinkIQ, a pioneer of digital manufacturing transformation SaaS, announced a new solution, VisualOps, designed to help organizations gain easy access to data from a material view, new visibility, and a path toward Industry 4.0 Manufacturing.

ThinkIQ VisualOps™ was created as a second step for companies on the path to Industry 4.0 Manufacturing. It standardizes data and makes it available in one location, empowering manufacturing leaders, plant managers, process and data engineers, and operators to explore manufacturing and supply chain data within the context of their business. The new function can also start the process of creating alerts and notifications that bring problems to immediate attention.

“The addition of VisualOps allows customers to start the journey of monetizing their manufacturing and supply chain data using an Industry 4.0 Platform that will help them achieve their digital transformation goals,” said Niels Andersen, CTO and CPO of ThinkIQ. “This product will help organizations obtain the benefits of Industry 4.0 and lead them on the path to Smart Manufacturing.”

Some of the additional benefits of ThinkIQ VisualOps include:

  • Moves companies past raw data to exploring, comparing, and staying aware of it, with standardized metrics and views that bring wide visibility and context to what is currently just digital bits.
  • Harnesses mostly disconnected existing data streams from IoT, IIoT, HMIs, PLCs, CRM, MES, digitized manual data, and partner data, bringing them into one single location.
  • Includes on-premises gateways and connectors that centralize the data and securely send it to the cloud; most clients don't need to add any new hardware or software to their existing environment.
  • Sources existing data from automation, IoT and IIoT, CRM, and other digital captures, and includes an equipment profile library, equipment modeling, manufacturing process layout, trending, standardized dashboards, and basic limits and notifications.

ThinkIQ's SaaS manufacturing cloud platform simplifies the creation of web-based applications and leverages the strengths of the Internet of Things, Big Data, Data Science, Semantic Modeling and Machine Learning. The platform collects data across the operation (from existing and IIoT sensors) and applies AI and ML to provide actionable real-time insights, such as identifying correlations and root causes, traceability problems, and yield issues. It creates a level of capability beyond what independent, disconnected operating environments can provide today.
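
To make the correlation example concrete, here is a minimal sketch of the kind of analysis that becomes possible once plant data is standardized in one location. It is illustrative only, not ThinkIQ's implementation, and the column names and readings are hypothetical.

    # Once sensor and yield data share one standardized table, a first
    # pass at root-cause hunting can be a simple correlation scan.
    import pandas as pd

    df = pd.DataFrame({
        "oven_temp_c":   [180, 182, 179, 191, 185, 178],
        "line_speed_mm": [40, 41, 40, 39, 42, 40],
        "yield_pct":     [96.1, 95.8, 96.3, 89.4, 93.0, 96.5],
    })

    # Rank process variables by absolute correlation with yield.
    corr = (df.corr()["yield_pct"]
              .drop("yield_pct")
              .abs()
              .sort_values(ascending=False))
    print(corr)   # oven_temp_c surfaces as the strongest candidate driver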

For more information about how ThinkIQ helps transform digital manufacturing operations by looking at material flow analysis instead of just IIoT equipment analytics, visit our website.

New SAS Viya offerings help better manage and navigate big data for AI and analytics
Mon, 03 May 2021 | https://digitalitnews.com/new-sas-viya-offerings-help-better-manage-and-navigate-big-data-for-ai-and-analytics/

Diverse and complex data environments create a challenge for organizations seeking to operationalize analytics to enable the best business decisions. Global analytics leader SAS continues to enhance its powerful, cloud-native SAS® Viya® platform with the inclusion of new data management solutions to further strengthen the foundation for data and analytic success.

“Analytics help organizations harness big data and use it to identify new opportunities. And the starting point for effective analytics is applying new techniques to discover, prepare and govern data,” said Tapan Patel, Senior Marketing Manager for AI and Analytics at SAS. “New SAS Viya data management offerings allow businesses to catalog assets across the cloud and on-premises and prepare data without hassle. These are fundamental elements of a successful analytic journey.”

Most analytics professionals spend too much time on data preparation and too little on building models and delivering analytic insights for their organizations. Whether data is simple or complex, organizations need innovative solutions that cut through the chaos, provide flexibility and efficiently transform data to create better analytical assets. Today, SAS is announcing two such innovative solutions.

SAS Studio Analyst
As analytics and AI become more embedded in day-to-day operations, organizations need data in the right place, in the right condition and at the right time.

Now available in SAS Viya, SAS Studio Analyst empowers data scientists and data analysts with a self-service environment to expedite the delivery of trusted data for analytics. SAS Studio Analyst lets users visually represent prebuilt or custom data quality and data preparation steps as part of a flow that can be easily reused and managed. This helps jump-start analytics initiatives and efficiently orchestrate the data they need.

SAS Information Governance
In a world with endless data sources, many organizations struggle to grasp all the data that is available, identify the contents of a data set, and evaluate data for analytics fitness. As data pours in from more and more sources, these issues compound, affecting the value of analytical outputs.

SAS Information Governance, also now available in SAS Viya, provides an intuitive data catalog and metadata search facility for analyst, business and IT users to ingest, discover and manage data resources, while simultaneously governing and protecting data. With the solution, data professionals can spend less time searching and organizing data, and more time on actual analysis.

Better data management and governance improves people’s health
Riverside County in Southern California relies on data integration and analytics from SAS to improve the health and well-being of more than 2.5 million residents. The county faced the challenge of pulling together vast amounts of data from multiple sources. The goal is to better serve the needs of its large and diverse population through the county's health system, which includes a large medical center, an inpatient psychiatric facility and several health care clinics.

To ensure the most vulnerable populations receive coordinated medical, behavioral health and socioeconomic services, Riverside County worked with SAS to design a solution that combines data preparation, advanced analytics and data visualization capabilities. The solution, powered by SAS Viya, enables the county to integrate health and non-health data from its public hospital, county jail, and behavioral health, social services and homelessness systems.

By connecting these databases, Riverside County can now see how individuals interact with different services across its health system and map care pathways to health outcomes. This is enhanced by entity resolution, a feature of the technology that enables the county to identify unique entities, even if a person’s name, address or other personal identifiers don’t match up across different databases.
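
Entity resolution itself can be sketched in a few lines. The toy example below is not SAS's algorithm; it simply shows the core idea of normalizing fields and fuzzy-matching names so records link even when identifiers differ. The records and matching threshold are hypothetical.

    # Toy entity resolution: decide whether two records refer to the
    # same person despite differing identifiers. Not SAS's algorithm.
    from difflib import SequenceMatcher

    def normalize(s):
        return "".join(ch for ch in s.lower() if ch.isalnum())

    def same_entity(a, b, threshold=0.85):
        name_sim = SequenceMatcher(
            None, normalize(a["name"]), normalize(b["name"])
        ).ratio()
        # Require a strong name match plus one corroborating field.
        return name_sim >= threshold and a["dob"] == b["dob"]

    hospital_rec = {"name": "Jon A. Smith", "dob": "1975-03-02"}
    social_rec   = {"name": "JON SMITH",    "dob": "1975-03-02"}
    print(same_entity(hospital_rec, social_rec))   # True despite name variants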

“Before SAS, we had a massive data integration problem,” said Judi Nightingale, Director of Population Health at Riverside County. “If we can get an integrated look at every client and reduce our siloed efforts, we can get everyone in the county to their health and wellness goals more quickly.”

Read the full story of how Riverside County tapped SAS Viya to improve the health and well-being of vulnerable Californians at sas.com/en_us/customers/riverside-county.html.

The latest release of SAS Viya includes the new data management offerings – SAS Studio Analyst and SAS Information Governance – and is designed to be delivered and updated continuously. The cloud-native software enables customers to efficiently democratize analytics throughout their organizations, while seamlessly managing analytic workloads and embedding analytics into a variety of operational applications for making confident decisions.

Twitter Expands Strategic Partnership with Google Cloud to Improve Data Insights and Enhance Productivity
Fri, 05 Feb 2021 | https://digitalitnews.com/twitter-expands-strategic-partnership-with-google-cloud-to-improve-data-insights-and-enhance-productivity/

Google Cloud today announced a new, multi-year, strategic partnership with Twitter. Twitter will deepen its initial work with Google and move its offline analytics, data processing, and machine learning workloads to Google's Data Cloud. This will allow Twitter to analyze data faster and improve the experience for people who use the service every day.

Behind every Tweet, Like and Retweet is a series of data points that helps teams understand how people are using the service and what types of content they might want to see. To process all of this information, Twitter's data platform ingests trillions of events, processes hundreds of petabytes of data, and runs tens of thousands of jobs on over a dozen clusters every day. With this expanded partnership, Twitter is adopting Google's Data Cloud, including BigQuery, Dataflow, Cloud Bigtable and machine learning (ML) tools. These tools not only power the company's rapidly growing data ecosystem to enable faster data-informed decisions, but also enable deeper ML-driven product innovation.

Using Google's Data Cloud, Twitter will be able to democratize data access by offering a range of data processing and machine learning tools to better understand and improve how Twitter features are used. Previously, engineers and data scientists often developed large custom data processing jobs; the same questions can now be answered faster with SQL queries in BigQuery. This will make it easier for both technical and non-technical teams to study data and accelerate the time to insight.
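
As a hedged sketch of what that shift looks like in practice, the snippet below counts a day's events per type with the official google-cloud-bigquery Python client, the sort of question that previously required a custom batch job. The project, dataset, and column names are hypothetical.

    # Interactive SQL over event data in BigQuery via the official
    # Python client. Table and column names are hypothetical.
    from google.cloud import bigquery

    bq = bigquery.Client()   # uses application-default credentials

    query = """
        SELECT event_type, COUNT(*) AS events
        FROM `my_project.analytics.engagement_events`
        WHERE DATE(event_ts) = CURRENT_DATE()
        GROUP BY event_type
        ORDER BY events DESC
    """
    for row in bq.query(query):   # iterating waits for the job to finish
        print(row.event_type, row.events)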

“Our initial partnership with Google Cloud has been successful and enabled us to enhance the productivity of our engineering teams. Building on this relationship and Google’s technologies will allow us to learn more from our data, move faster and serve more relevant content to the people who use our service every day. As Twitter continues to scale, we’re excited to partner with Google on more industry-leading technology innovation in the data and machine learning space,” said Parag Agrawal, CTO, Twitter.

“Helping customers manage the entire continuum of data – from storage to analytics to AI – is one of our key differentiators at Google Cloud,” said Thomas Kurian, CEO of Google Cloud. “It’s been phenomenal to watch this company grow over the years, and we’re excited to partner with Twitter to innovate for the future and deliver the best experience possible for the people that use Twitter every day.”

Survey Reveals One Third of Businesses are Exceeding Their Cloud Budgets by as Much as 40 Percent
Wed, 27 Jan 2021 | https://digitalitnews.com/survey-reveals-one-third-of-businesses-are-exceeding-their-cloud-budgets-by-as-much-as-40-percent/

Pepperdata, the leader in Analytics Stack Performance, announced the results of a new survey of 750 senior enterprise IT professionals in industries ranging from finance to healthcare, automotive, advertising and other data-intensive businesses. Key findings include that more than one-third of businesses have cloud budget overruns of up to 40 percent, and one in 12 companies exceed budget by more than 40 percent.

The survey was conducted to better understand how organizations run their big data applications and workloads in the cloud. Companies polled ranged in size from 500 to more than 5,000 employees and spent from $500,000 to more than $10M on big data analytics.

The shift to cloud computing is solidly underway. While the cloud offers both the benefits of a pay-as-you-go model and the ability to be elastic on demand, enterprises almost universally see a rise in costs. This is because IT often lacks sufficient visibility into cloud performance and does not have the tools to optimize applications.

Key findings from the survey include:

  • For 64 percent of respondents, "cost management and containment" is their biggest concern with running cloud big data technologies and applications.
  • A majority of respondents said the desire to “better optimize current cloud resources” was their highest priority big data cloud initiative.
  • In 2020, for one in three respondents, cloud spend was projected to be over budget by between 20 percent and 40 percent.
  • One in 12 respondents said their cloud spend was expected to be over budget by more than 40 percent.

“This research shows us the importance of visibility into big data workloads. It also highlights the need for automated optimization as a means to control runaway costs,” said Ash Munshi, CEO, Pepperdata. “Significantly more than half the companies surveyed indicate lack of control on cloud spend, making optimization and visibility the keys to getting costs under control.”

Other findings included:

  • Types of cloud: 47 percent of respondents are using a private cloud; 21 percent are in the public cloud; 28 percent are using a combination of both.
  • Biggest concerns when running big data applications in the cloud: 39 percent of respondents cited cost management and containment; 33 percent said increased complexity; 14 percent said the shift from CapEx to OpEx; 13 percent said a lack of control.
  • When asked how budgets were impacted by the move to the cloud, 40 percent stated that budgets stayed the same and resided with IT; 31 percent said budget became shared with IT administering; 30 percent said budgets moved into one or multiple business units.
  • When asked how their enterprise measures application/workload performance based on their cloud instance, 30 percent said they were using an application performance monitoring solution; 28 percent said they used cloud provider tools; almost 20 percent said a homegrown solution; 17 percent said an observability solution for insights, alerts and automation; five percent said they don’t monitor at all.
  • Moving to the cloud often means confusion as to which part of the organization owns support and troubleshooting for big data applications: 44 percent said they had a shared support model between ITOps and business units/LOB developers; 35 percent said support stays with ITOps; 22 percent said the dev organization within the business units owns this.
  • Spend: Asked to estimate what they will spend this year on big data analytics in the cloud, 34 percent said between $500,000 and $1M; 26 percent said between $1M and $2M; 15 percent said between $2M and $10M; seven percent said more than $10M.
  • Budgets: For 2020, 44 percent said they’d be on budget; 35 percent expected to exceed budget by 20 to 40 percent; eight percent of respondents believed they’d exceed budget by more than 40 percent.

Cloud optimization delivers big savings. According to Google, even minimal cloud optimization efforts can net a business as much as 10 percent in savings per service within two weeks. Cloud services that are fully optimized over extended periods (more than six weeks) can save more than 20 percent. Key findings surrounding cloud optimization were:

  • Top big data cloud initiatives included better optimizing current cloud resources and continuing to migrate workloads into the cloud. Lower-priority initiatives included increasing visibility into app and workload performance; expanding to other public cloud vendors; more use of containers; and finding a reliable replacement for Hadoop.
  • The types of applications and workloads that consumed the most resources, according to respondents, were: Hive, 29 percent; Spark, 27 percent; MapReduce, 16 percent; Tez, 11 percent; and other applications, 18 percent.

To cut the waste out of IT operations processes and achieve true cloud optimization, enterprises need observability and automated tuning. This requires machine learning and a unified analytics stack performance platform. Such a setup equips IT operations teams with the cloud tools they need to keep their infrastructure running optimally, while minimizing spend.

View the full Pepperdata Big Data Survey Report

HelpSystems Acquires FileCatalyst to Continue Expansion of Cybersecurity and Automation Portfolio
Fri, 08 Jan 2021 | https://digitalitnews.com/helpsystems-acquires-filecatalyst-to-continue-expansion-of-cybersecurity-and-automation-portfolio/

HelpSystems announced the acquisition of FileCatalyst, a leader in enterprise file transfer acceleration. FileCatalyst enables organizations working with extremely large files to optimize and transfer their information swiftly and securely across global networks. This can be particularly beneficial in industries such as broadcast media and live sports.

With the increasing need to share video and other media-rich files, big data, and extensive databases, many businesses struggle with technology and bandwidth constraints. FileCatalyst solves these challenges by enabling files to move at speeds hundreds of times faster than what FTP allows, while ensuring secure and reliable delivery and tracking. This empowers businesses to work more efficiently, without the latency and packet loss that can plague the movement of vast amounts of information when it comes to content distribution, file sharing, and offsite backups.
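
The "hundreds of times faster" claim traces back to how TCP, which FTP rides on, behaves on long, lossy links. The widely used Mathis et al. approximation bounds TCP throughput at roughly MSS/(RTT·√p) for loss rate p, and the back-of-envelope sketch below shows how quickly that ceiling collapses with distance; the numbers are illustrative, not FileCatalyst benchmarks.

    # Mathis et al. approximation: TCP throughput <= MSS / (RTT * sqrt(p)).
    # Shows why FTP-over-TCP crawls on long, lossy links, the problem
    # UDP-based accelerators address. Numbers are illustrative.
    import math

    MSS_BITS = 1460 * 8          # typical segment size, in bits

    def tcp_ceiling_mbps(rtt_s, loss):
        return MSS_BITS / (rtt_s * math.sqrt(loss)) / 1e6

    for rtt_ms in (10, 100, 300):   # LAN, cross-country, intercontinental
        mbps = tcp_ceiling_mbps(rtt_ms / 1000, 0.01)   # 1% packet loss
        print(f"RTT {rtt_ms:>3} ms -> ~{mbps:.1f} Mbps ceiling")
    # RTT  10 ms -> ~11.7 Mbps; RTT 100 ms -> ~1.2 Mbps; RTT 300 ms -> ~0.4 Mbps

A protocol that handles retransmission itself over UDP is not bound by this window-based ceiling, which is how accelerated transfers can keep a link saturated regardless of round-trip time.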

“Our customers and partners have expressed a growing need to move significant volumes of data more quickly than ever before, and FileCatalyst addresses this problem effectively for many well-known organizations,” said Kate Bolseth, CEO, HelpSystems. “FileCatalyst is an excellent addition to our managed file transfer and robotic process automation offerings, and we are pleased to bring the FileCatalyst team and their strong file acceleration knowledge into the global HelpSystems family.”

“We are thrilled to become part of a company with deep roots and expertise in both cybersecurity and automation,” said Chris Bailey, CEO and Co-Founder, FileCatalyst. “Our customers will find value in pairing our file transfer acceleration solutions with HelpSystems’ extensive solution suites.”
