Azure Synapse vs. Databricks: Data Platform Comparison 2024

Compare Azure Synapse and Databricks for your data needs. Explore features, performance, and use cases to make an informed decision.

Both Microsoft Azure Synapse and Databricks are well-respected data platforms that provide the volume, speed, and quality demanded by leading data analytics and business intelligence solutions. They both serve an urgent need in the modern business world, where data analytics and management have become more important than ever.

  • Azure Synapse: Best for unified data analytics across big data systems and data warehouses.
  • Databricks: Best for use cases such as streaming, machine learning, and data science-based analytics.

Continue reading to see how Azure Synapse and Databricks stack up against each other in terms of pricing, features, implementation, AI, security, and integration.

Azure Synapse vs. Databricks at a Glance

Price
  • Azure Synapse: Flexible, detailed pricing; pay-as-you-go; options for savings with pre-purchased units.
  • Databricks: Flexible pay-as-you-go; 14-day free trial.

Core Features
  • Azure Synapse: Scale and query flexibility; integrated ML and BI; unified analytics workspace; real-time insights with Synapse Link; advanced security and privacy.
  • Databricks: Data sharing; data engineering; comprehensive data governance; advanced data warehousing; AI and ML.

Ease of Implementation
  • Azure Synapse: Integrates seamlessly with other Azure services; familiar for users in Microsoft’s cloud ecosystem.
  • Databricks: Offers a collaborative environment with interactive notebooks but may require familiarity with Apache Spark for optimal use.

AI and ML
  • Azure Synapse: Integrates with Azure Machine Learning and Power BI, providing tools for machine learning projects and business intelligence.
  • Databricks: Excels in machine learning and AI with an optimized Spark engine and tools like MLflow for managing the ML life cycle.

Architecture
  • Azure Synapse: SQL-based data warehousing with big data integration, optimized for large datasets and complex queries.
  • Databricks: Data lake architecture leveraging Apache Spark for distributed data processing and machine learning workloads.

Processing and Performance
  • Azure Synapse: Optimized querying with automatic scaling and performance tuning, leveraging serverless SQL pools for dynamic resource allocation.
  • Databricks: Parallel computation and efficient data ingestion and access patterns, optimized for large datasets with the Photon engine.

Security
  • Azure Synapse: Advanced security and privacy controls, such as automated threat detection, always-on encryption, and fine-grained access control.
  • Databricks: Robust security features, including role-based access control and automatic encryption, with a focus on collaborative environments.

Integration
  • Azure Synapse: Extensive integration with Azure and third-party solutions.
  • Databricks: Wide range of integrations; supports major data storage providers.


Azure Synapse Overview

Azure Synapse, previously known as Microsoft Azure SQL Data Warehouse, integrates big data and data warehousing into a single platform.

Its architecture is built on a strong SQL foundation, designed to handle large volumes of data through massively parallel processing. This approach allows Synapse to deliver rapid processing without solely relying on expensive memory, utilizing clustered and nonclustered column store indexes to efficiently manage data storage and distribution.
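
To make the distribution model concrete, here is a minimal sketch of creating a hash-distributed, columnstore-backed table in a Synapse dedicated SQL pool from Python. The server, database, credentials, and table are placeholders; the WITH clause is the standard dedicated-pool syntax.

```python
# pip install pyodbc
import pyodbc

# Placeholder connection values for a dedicated SQL pool.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<your-workspace>.sql.azuresynapse.net;"
    "DATABASE=<dedicated-pool>;UID=<user>;PWD=<password>",
    autocommit=True,
)
conn.execute("""
    CREATE TABLE dbo.FactSales (
        SaleId     BIGINT NOT NULL,
        CustomerId INT    NOT NULL,
        Amount     DECIMAL(18, 2)
    )
    WITH (
        DISTRIBUTION = HASH(CustomerId),  -- spreads rows across compute nodes
        CLUSTERED COLUMNSTORE INDEX       -- column-oriented storage, the pool default
    );
""")
```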

Key Features

  • Limitless scale and query flexibility: Azure Synapse can handle massive datasets without compromising performance, as users can query data across various sources, including data warehouses, data lakes, and big data analytics systems, using both relational and nonrelational data in their preferred language (see the serverless query sketch after this feature list). This feature particularly benefits organizations with diverse data ecosystems that need seamless integration and analysis of all data types.
Azure Synapse chart view of insights from multiple data sources.
  • Integrated machine learning and BI: The integration with Power BI and Azure Machine Learning empowers users to discover insights across all data. Practitioners can apply machine learning models directly within their apps, significantly reducing the development time for BI and ML projects. This democratizes advanced analytics and allows users to leverage intelligence across all critical data, including third-party datasets, and enhance decision-making processes.
Insights of a sales dashboard powered by Azure ML and Power BI integration.
  • Unified analytics workspace: Synapse Studio offers a comprehensive workspace for various data tasks, from data prep and management to data warehousing and artificial intelligence. Its code-free environment for data pipeline management, coupled with automated query optimization and seamless Power BI integration, streamlines project workflows. Teams looking to collaborate efficiently on analytics solutions, from data engineers and scientists to business analysts, will appreciate this capability.
Selecting a Synapse Analytics workspace.
  • Real-time insights with Azure Synapse Link: Azure Synapse Link eliminates traditional ETL (extract, transform and load) bottlenecks by providing near-real-time data integration from operational databases and business applications to Azure Synapse Analytics. Organizations can achieve an end-to-end business view more quickly and efficiently, which gives rise to a data-driven culture by democratizing data access across teams.
  • Advanced security and privacy: Azure Synapse ensures data protection with state-of-the-art security features, including automated threat detection and always-on encryption. Fine-grained access controls, such as column-level and row-level security, encryption, and dynamic data masking, safeguard sensitive information in real time. This thorough approach to security, backed by Microsoft’s significant investment in cybersecurity, provides peace of mind for organizations concerned with data privacy and compliance.
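
As an illustration of the cross-source querying called out in the first feature above, the following sketch reads Parquet files in a data lake through a Synapse serverless SQL pool. The endpoint, storage account, and path are placeholders; OPENROWSET with FORMAT = 'PARQUET' is the documented serverless pattern.

```python
import pyodbc

# Serverless endpoints use the "-ondemand" suffix; values are placeholders.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<your-workspace>-ondemand.sql.azuresynapse.net;"
    "DATABASE=master;UID=<user>;PWD=<password>"
)
rows = conn.execute("""
    SELECT TOP 10 *
    FROM OPENROWSET(
        BULK 'https://<storage-account>.dfs.core.windows.net/<container>/sales/*.parquet',
        FORMAT = 'PARQUET'
    ) AS sales;
""").fetchall()
for row in rows:
    print(row)
```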

Pros

  • Ideal for analytics with its comprehensive analytics service.
  • Offers data protection, access control, and network security features.
  • Scalability through massively parallel processing, enabling efficient performance optimization.
  • Delivers deep integration with Azure services for enhanced data management and analytics workflows.

Cons

  • Can be complex due to its broad range of features.
  • Pricing depends on various factors, like the number of data warehouse units and the amount of data stored.
  • High-performance configurations can significantly consume resources.
  • While powerful within the Azure ecosystem, it may be less flexible outside of it.


Databricks Overview

Databricks, founded on Apache Spark, offers a unified analytics platform that emphasizes machine learning and AI-driven analytics. Positioned more as a data lake than a traditional data warehouse, Databricks excels in handling raw, unprocessed data at scale. Its SaaS delivery model across AWS, Azure, and Google Cloud provides flexibility and scalability to serve a vast range of data processing and analytics needs.
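
A typical Databricks job reads raw files at scale and lands them in a governed Delta table. The PySpark sketch below assumes a hypothetical mount point and table name; on Databricks a SparkSession named spark already exists, but one is built here to keep the example self-contained.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("raw-to-delta").getOrCreate()

# Hypothetical mount point for raw CSV landing data.
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("/mnt/raw/events/*.csv"))

cleaned = (raw.dropna(subset=["event_id"])
              .withColumn("ingested_at", F.current_timestamp()))

# Delta is the default table format on Databricks.
cleaned.write.format("delta").mode("append").saveAsTable("bronze.events")
```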

Key Features

  • Data Sharing with Delta Sharing: Databricks allows secure data sharing with Delta Sharing, enabling data and AI asset sharing within and outside organizations. This feature is crucial for businesses looking to collaborate on data projects across different platforms, enhancing data accessibility and collaboration.
Open marketplace enabling users to share their assets.
  • Data engineering: Databricks excels in data engineering, offering robust tools for data preprocessing and transformation. This is essential for organizations focusing on developing machine learning models, ensuring data is in the right format and quality for analysis.
Data science and engineering dashboard in Databricks’ community edition.
  • Comprehensive data governance: With features like data cataloging and quality checks, Databricks ensures data is clean, cataloged, and compliant, making it discoverable and usable across the organization. This is vital for companies aiming to maintain high data quality and governance standards.
  • Advanced data warehousing: Databricks brings cloud data warehousing capabilities to data lakes with its lakehouse architecture, allowing modeling of a cost-effective data warehouse on the data lake. This suits businesses looking for scalable and efficient data warehousing solutions.
  • Artificial intelligence and machine learning: Databricks provides a vast platform for AI and ML, including support for deep learning libraries and large language models. Users can monitor data, features, and AI models in one place, which is useful for organizations looking to leverage AI and ML for advanced analytics and insights (see the MLflow sketch after this feature list).
A dashboard monitoring financial transactions.
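
For the model monitoring and lifecycle management mentioned in the last feature above, Databricks bundles MLflow. A minimal tracking sketch, assuming MLflow and scikit-learn are installed (on Databricks the tracking server is preconfigured; elsewhere, runs land in a local mlruns directory):

```python
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf-baseline"):
    model = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)     # hyperparameters
    mlflow.log_metric("accuracy", acc)        # evaluation metrics
    mlflow.sklearn.log_model(model, "model")  # versioned model artifact
```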

Pros

  • Robust support for machine learning and AI projects with integrated tools like MLflow.
  • Built on Apache Spark, ensuring high performance for data processing tasks.
  • Available on AWS, Azure, and Google Cloud, providing deployment flexibility.
  • Shared notebooks facilitate collaboration and boost productivity of data teams.

Cons

  • Aimed at a technical market, it may appear complex and not user-friendly.
  • Requires more manual input for tasks like cluster resizing or configuration updates.
  • Can be costly for extensive data processing and storage needs.
  • Integrating with existing data systems and workflows may need significant effort.

Best for Pricing: Databricks

When comparing the pricing models of Azure Synapse and Databricks, Databricks offers a more accessible entry point with its 14-day free trial, which includes a collaborative environment for data teams and interactive notebooks supporting a wide range of technologies. Its products employ a pay-as-you-go model, with starting prices ranging from $0.07 to $0.40 per Databricks Unit (DBU) depending on the product.
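
For a rough sense of what pay-as-you-go DBU pricing means in practice, here is a back-of-the-envelope estimate. The workload numbers are hypothetical, and the figure excludes the cloud provider’s separate VM charges.

```python
# Hypothetical workload; rates vary by product, tier, and cloud.
dbu_rate = 0.40        # $/DBU, the top of the quoted range
dbus_per_hour = 4      # depends on cluster size and workload type
hours_per_month = 160

monthly_cost = dbu_rate * dbus_per_hour * hours_per_month
print(f"Estimated Databricks compute: ${monthly_cost:,.2f}/month")  # $256.00
```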

Azure Synapse, on the other hand, provides a detailed pricing structure that includes options for pre-purchasing Synapse Commit Units (SCUs) for savings over pay-as-you-go prices, with discounts up to 28%.

Pricing for Azure Synapse is based on various factors, including data pipeline activities, integration runtime hours, and data storage, with specific charges for serverless and dedicated consumption models.

While Azure Synapse offers a comprehensive and scalable solution, the complexity of its pricing model and the potential costs associated with large-scale data warehousing and data analytics workloads might make Databricks a more cost-effective option for teams just starting out or with variable usage patterns.

Best for Core Features: Azure Synapse

Azure Synapse offers a comprehensive suite of analytics services that integrate enterprise data warehousing and big data processing. Its core features include limitless scale for querying data, integration with Power BI and Azure Machine Learning for expanded insights, and a unified analytics workspace in Synapse Studio for data prep, management, and exploration.

These capabilities make Azure Synapse particularly well-suited for teams that want a robust platform that can handle extensive data warehousing and analytics tasks within the Azure ecosystem.

Databricks positions itself as more of a data lake than a data warehouse. Thus, the emphasis is more on use cases such as streaming, machine learning, and data science-based analytics. It can be used to handle raw unprocessed data in large volumes.

For those wanting a top-class data warehouse for analytics, Azure Synapse wins. But for those needing more robust ELT (extract, load, transform), data science, and machine learning features, Databricks is the winner.

Best for Ease of Implementation: Azure Synapse

Synapse’s reliance on SQL and Azure offers familiarity to the many companies and developers who use those platforms around the world. For them, it is easy to use. Similarly, Databricks is comfortable territory for those used to Apache tools. But Databricks does take a data science approach, using open-source and machine learning libraries, which may be challenging for some users.

Databricks supports Python, Scala, SQL, and other languages. It comes packaged with its own user interface as well as ways to connect to endpoints such as JDBC connectors. Some users, though, report that it can appear complex and not user-friendly, as it is aimed at a technical market and needs more manual input for cluster resizing or configuration updates. There may be a steep learning curve for some.
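
For example, connecting to a Databricks SQL warehouse from an external application follows the familiar cursor pattern. The sketch below uses the databricks-sql-connector package; the hostname, HTTP path, and token are placeholders.

```python
# pip install databricks-sql-connector
from databricks import sql

with sql.connect(
    server_hostname="<workspace>.cloud.databricks.com",  # placeholder
    http_path="/sql/1.0/warehouses/<warehouse-id>",      # placeholder
    access_token="<personal-access-token>",
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT current_catalog(), current_schema()")
        print(cur.fetchone())
```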

Best for Machine Learning & AI: Databricks

Databricks beats Azure in this category with its Mosaic AI, part of the Databricks Data Intelligence Platform. This platform unifies data, model training, and production environments into a single solution, allowing for the secure use of enterprise data to augment, fine-tune, or build custom machine learning and generative AI models. Databricks offers a more specialized environment tailored for ML and AI development, making it the preferred platform for data scientists and teams working on cutting-edge AI projects.

Azure Synapse Analytics also offers AI and ML capabilities, particularly through its integration with Azure AI services. It allows the enrichment of data with AI in Synapse Analytics using pretrained models from Azure AI services. The platform supports a variety of AI tasks, such as sentiment analysis, anomaly detection, and cognitive services, directly within Synapse notebooks. However, Azure Synapse’s AI and ML functionalities are more about leveraging existing Azure services rather than providing a deeply integrated, customizable ML environment.
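
Outside of Synapse notebooks, the same Azure AI services are reachable through the Azure SDK. A minimal sentiment-analysis sketch, assuming a provisioned Azure AI Language resource (the endpoint and key are placeholders):

```python
# pip install azure-ai-textanalytics
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

client = TextAnalyticsClient(
    endpoint="https://<your-language-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<api-key>"),
)
docs = [
    "The quarterly numbers beat expectations.",
    "Support response times have been disappointing.",
]
for doc, result in zip(docs, client.analyze_sentiment(docs)):
    print(f"{result.sentiment:>8}: {doc}")
```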

Best for Security: Azure Synapse

This is highly dependent on use case; however, for enterprise users, Synapse is the winner. Azure Synapse implements a multilayered security architecture, ensuring end-to-end protection of data. Key security features include data protection with encryption at rest and in motion, comprehensive access control, authentication to verify user and application identities, network security with private endpoints and virtual networks, and advanced threat protection.

This extensive security framework, combined with Azure’s enterprise-grade compliance, makes it quite hard to overlook Azure Synapse as the superior choice for organizations with stringent security and privacy requirements.

Databricks also emphasizes security, offering features like Databricks Runtime for Machine Learning with built-in security for ML workflows, collaborative notebooks with role-based access control, and integration with enterprise security systems. However, Azure Synapse’s deep integration with the broader Azure security and compliance ecosystem, along with its detailed security layers, provides a more holistic security approach.

Best for Integration: Azure Synapse

Azure Synapse offers a wide range of integrations with third-party data integration solutions, supporting a wide corporate ecosystem that includes both Azure and on-premises data sources, as well as legacy systems. This extensive integration capability is facilitated by partnerships with numerous third-party providers such as Ab Initio, Aecorsoft, Alooma, and Alteryx, among others.

Databricks also provides robust integration options, particularly through its Partner Connect hub, which simplifies the integration process with Databricks clusters and SQL warehouses. Databricks supports a variety of data formats like CSV, Delta Lake, JSON, and Parquet, and connects with major data storage providers such as Amazon S3, Google BigQuery, and Snowflake. Additionally, Databricks Repos offers repository-level integration with Git providers, enhancing the development workflow within Databricks notebooks.
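
On the consuming side, Delta Sharing has an open-source Python connector, so a recipient does not need a Databricks account to read a shared table. A sketch, assuming the provider has issued a profile file; the share, schema, and table names are hypothetical.

```python
# pip install delta-sharing
import delta_sharing

# The profile file from the data provider holds the server URL and token.
profile = "/path/to/config.share"
table_url = profile + "#retail_share.sales.daily_orders"  # share.schema.table

df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```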

However, Azure Synapse’s broader range of data integration partnerships, combined with its native integration within the Azure ecosystem, offers a more extensive solution for organizations seeking to consolidate and analyze data from a wide array of sources.

Who Shouldn’t Use Azure Synapse or Databricks

As robust and extensively featured as these two platforms are, no single platform can meet the needs of every kind of data professional.

Who Shouldn’t Use Azure Synapse

Azure Synapse, with its expansive data analytics capabilities and integration within the Azure ecosystem, might not be the best fit for small businesses or startups with limited data analytics requirements or budget constraints. The platform’s complexity, and the level of technical expertise needed to navigate its extensive features, can frustrate organizations that don’t have a dedicated data team.

Additionally, companies not already using Azure services might struggle to integrate Synapse into their existing workflows, making it less ideal for those outside the Azure ecosystem.

Who Shouldn’t Use Databricks

Databricks is tailored for data science and engineering projects. As a result, it can be overwhelming for nontechnical users or those new to data analytics. Its reliance on Apache Spark and emphasis on machine learning and artificial intelligence might not align with the needs of projects that require straightforward data processing or analytics solutions.

Moreover, the cost associated with Databricks’ advanced capabilities, especially for large-scale data processing, might not be justified for organizations with simpler data analytics needs or limited financial resources.

Best Alternatives to Azure Synapse & Databricks


Google Cloud BigQuery

BigQuery, Google’s fully managed enterprise data warehouse, excels in managing and analyzing data with features like machine learning and geospatial analysis. Its serverless architecture allows for SQL queries to answer complex organizational questions without infrastructure management.

BigQuery’s separation of compute and storage layers enables dynamic resource allocation, enhancing performance and scalability. It’s great for teams that want a powerful analytics tool with fast query execution and extensive data integration capabilities.
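
Because BigQuery is serverless, running a query from Python is a single client call with no cluster to provision. A sketch using the google-cloud-bigquery library against one of Google’s public datasets, assuming Application Default Credentials are configured:

```python
# pip install google-cloud-bigquery
from google.cloud import bigquery

client = bigquery.Client()  # auth via Application Default Credentials

query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 5
"""
for row in client.query(query).result():  # no infrastructure to manage
    print(row.name, row.total)
```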


Snowflake

Snowflake’s cloud data platform is known for its unique architecture that separates compute from storage, allowing for independent scaling and a pay-as-you-go model. It supports standard and extended SQL, transactions, and advanced features like materialized views and lateral views.

Snowflake’s approach to data encryption, object-level access control, and support for PHI data underlines its commitment to security and compliance. It gives organizations a flexible, scalable solution with strong security features.
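
Snowflake’s compute/storage split shows up even in a basic connection: the session names a virtual warehouse (compute) independently of any database. A sketch using the snowflake-connector-python package, with placeholder credentials:

```python
# pip install snowflake-connector-python
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account-identifier>",  # placeholder
    user="<user>",
    password="<password>",
    warehouse="ANALYTICS_WH",  # compute, scaled independently of storage
)
cur = conn.cursor()
cur.execute("SELECT CURRENT_VERSION()")
print(cur.fetchone())
cur.close()
conn.close()
```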


Teradata Vantage

Teradata Vantage offers a connected multicloud data platform for enterprise analytics, solving complex data challenges efficiently. Vantage is known for its high-performance analytics, comprehensive data integration, and advanced AI and machine learning capabilities, great for enterprises that want reliable analytics across diverse data sets and cloud environments.

Review Methodology: Azure Synapse vs. Databricks

We compared Azure vs. Databricks based on their cost, capabilities, integrations, approach to AI and ML, and user experience.

  • Pricing: We evaluated the cost structures of both platforms, considering the transparency and predictability of pricing models, the availability of free trials or versions, and the overall value for money.
  • Core features: We examined the capabilities of the two to determine what each is good at. For Azure Synapse, we focused on its data integration, analytics, and management capabilities, while for Databricks, we looked at its collaborative environment, performance optimization, and support for machine learning and AI workflows.
  • AI and ML capabilities: We assessed each platform’s strengths in supporting AI and ML projects, such as the availability of built-in models and integration with external AI services.
  • User experience: The ease of use, interface design, and ease of setting up are some of the factors we analyzed here to determine which platform provides a more user-friendly experience.
  • Integration: We looked at each platform’s ability to integrate with other tools and services, including data sources, BI tools, and other cloud services.

FAQs: Azure Synapse vs. Databricks

What is the difference between Azure Synapse & Databricks?

Azure Synapse integrates data warehousing and big data analytics within the Azure ecosystem, offering a unified analytics workspace. Databricks, based on Apache Spark, focuses on collaborative data science and machine learning, supporting a wide range of data analytics workflows.

How do Azure Synapse & Databricks handle big data processing & analytics differently?

Azure Synapse uses a massively parallel processing architecture ideal for enterprise data warehousing, while Databricks leverages Spark’s in-memory processing for real-time analytics and AI-driven projects, making it suitable for data science tasks.

Are there any specific use cases where Azure Synapse excels over Databricks, & vice versa?

Synapse is preferred for traditional data warehousing and integration within the Azure platform, making it a fitting choice for businesses that need large-scale data management. Databricks, on the other hand, excels in data science and machine learning projects, making it the better option for teams that want a more flexible environment for collaborative analytics.

Bottom Line: Azure Synapse vs. Databricks

Azure Synapse and Databricks each cater to different aspects of data analytics and management. Synapse is ideal for enterprises deeply integrated with Microsoft Azure that need robust data warehousing solutions and is more suited for data analysis and for users familiar with SQL.

Databricks is better suited for data science teams that need a collaborative environment with strong machine learning and AI capabilities, and it serves a more technical audience than Synapse. Ultimately, choosing between the two comes down to platform preference, use case, existing infrastructure, and financial resources.

For a deeper understanding of the data analytics market, see our guide: Best Data Analytics Tools 

Report: Digital Trust Boosts Productivity and Revenue

A recent survey conducted by DigiCert provides insights into the state of digital trust among global enterprises. Effective digital trust management ensures the security, privacy, and reliability of digital processes, systems, and interactions. Establishing and maintaining digital trust has become a significant differentiator for organizational success.

The survey targeted 300 senior IT, information security, and software development and operations (DevOps) managers working in organizations with more than 1,000 employees across North America, Europe, and Asia-Pacific. The survey findings were published in the 2024 State of Digital Trust report, highlighting a stark contrast between top-performing companies (leaders) and lower-performing ones (laggards).

Higher Revenue and Increased Employee Productivity

The survey indicated that digital trust leaders, representing the top 33 percent of the respondents, have higher revenue, digital innovation, and increased employee productivity.

These leaders excel in responding to outages and incidents, show readiness for post-quantum computing, and effectively utilize the Internet of Things (IoT). They demonstrate a mature approach to administering digital trust through centralized certificate management and the use of email authentication and encryption (S/MIME) technology.

Conversely, the bottom 33 percent—the laggards—struggle in these areas, facing challenges in leveraging digital innovation and maintaining robust digital infrastructure and security practices. Notably, while leaders experienced few system outages, data breaches, and compliance issues, half of the laggards reported problems with IoT standards compliance, and many suffered from software trust mishaps.

Only one in 100 companies surveyed claimed to have highly developed digital trust practices, indicating a common problem in maintaining enterprise digital trust. Furthermore, 98 percent of reported outages and brownouts were attributed to digital trust issues like expired certificates or domain name system (DNS) problems. None of the respondents were confident in their ability to react promptly to such incidents.
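
Given that expired certificates were a leading cause of those outages, even a simple expiry monitor helps. The sketch below checks how many days remain on a host’s TLS certificate using only Python’s standard library:

```python
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(host: str, port: int = 443) -> int:
    """Return the number of days before the host's TLS certificate expires."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # notAfter looks like 'Jun  6 11:37:51 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    return (expires.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)).days

print(days_until_expiry("www.eweek.com"))
```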

For more information about how digital transformation is driving progress, see our coverage: Digital Transformation Guide

The Challenge of Quantum Computing

The looming growth of quantum computing adds another layer of complexity. Quantum capabilities are rapidly accelerating, driven by tech advancements like generative artificial intelligence.

The report uncovers a gap in preparedness for quantum-resistant technologies and the need for strategic action in the face of this evolving threat. According to the data, 61 percent of organizations find themselves underprepared for the post-quantum transition.

Leaders estimate a two-year timeframe to fully respond to the quantum shift, whereas laggards project three years or more. This discrepancy highlights the urgency of developing actionable plans, especially given the current five-year window before quantum computing becomes a more pressing concern. Therefore, immediate and strategic action is necessary in the face of uncertainty, said Brian Trzupek, Senior VP of Product at DigiCert.

“This quantum thing is a big deal. People are starting to get the visibility that this is a looming challenge. It’s greater than just a digital certificate replacement because it’s fundamentally the algorithm that has been attacked. All the libraries for the dependent client software, for the web servers, for the app servers, for the databases—all those things will need updates, including a certificate, to make that work,” said Trzupek.

Concerns Around SSH Protocols

Survey respondents are concerned about the reliance of the secure shell (SSH) protocol on Rivest-Shamir-Adleman (RSA) public-key encryption, which is used ubiquitously across cloud services for secure communication and authentication.

Additionally, hardware implementations of RSA, such as secure sockets layer (SSL) offloading and accelerators, present a significant challenge. Trzupek shared an example of one cloud provider that reported having 200,000 such devices, all potentially rendered obsolete by the shift to quantum-resistant algorithms.

Another surprising finding is that 87 percent of the respondents reported that their IoT devices transmitted personally identifiable information (PII) over unencrypted channels. This security loophole in IoT devices poses a threat to user privacy. Fortunately, businesses are now recognizing the significance of upgrading their digital infrastructure to protect users.

Issues in Software Trust

There are major developments happening in the realm of software trust, mainly in implementing software bills of materials (SBOMs) or detailed inventories of software components.

In the previous report, approximately three percent of organizations were aware of or working on SBOMs. In this report, the number has increased monumentally to 99 percent. While organizations recognize the importance of SBOMs, the actual deployment and meaningful use of SBOMs may not be as widespread as the numbers suggest.

Electronic signatures (e-signatures) have also emerged as a key area of interest, with a low percentage of respondents saying their e-signature practices are extremely mature. E-signatures are usually handled by business teams, such as legal, human resources, and procurement, rather than by the IT department.

Only about one in eight organizations understand the difference between simple e-signatures and the more secure ones that use certificates. Nearly half (48 percent) use electronic seals on their documents, and most (86 percent) use digital signatures with certificates issued by trusted third parties.

“There are business processes for how you apply those signatures. We see a lot of customers still struggling to make use of cryptographically secure signatures on content like mortgage documents and healthcare documents. They’re definitely looking at making those processes very easy to use. From our survey here, you can see that that’s something they’re still trying to work on,” said Trzupek.

Bottom Line: How to Enhance Digital Trust

To enhance digital trust, DigiCert recommends that organizations thoroughly inventory their digital assets, define clear policies, centralize public key infrastructure (PKI) management, and prioritize their efforts based on business impact.

This can help mitigate security issues, build confidence among customers and partners, and improve operations. Effective digital trust management enables organizations to navigate regulatory challenges, ensuring compliance while protecting sensitive data and adapting to cyber threats.

To learn about the companies leading the way in digital transformation, see our guide: Top Digital Transformation Companies

Cognos vs. Power BI: 2024 Data Platform Comparison

IBM Cognos Analytics and Microsoft Power BI are two of the top business intelligence (BI) and data analytics software options on the market today.

Both of these application and service suites are in heavy demand, as organizations seek to harness real-time repositories of big data for various enterprise use cases, including artificial intelligence and machine learning model development and deployment.

When choosing between two of the most highly regarded data platforms on the market, users often have difficulty differentiating between Cognos and Power BI and weighing each platform’s pros and cons. In this in-depth comparison guide, we compare these two platforms across a variety of qualities and variables to assess where their strengths lie.

But first, here’s a glance at the areas where each tool excels most:

  • Cognos Analytics: Best for advanced data analytics and on-premises deployment. Compared to Power BI, Cognos is particularly effective for advanced enterprise data analytics use cases that require more administrative controls over security and governance. Additionally, it is more reliable when it comes to processing large quantities of data quickly and accurately.
  • Power BI: Best for affordable, easy-to-use, integrable BI technology in the cloud. Compared to Cognos Analytics, Power BI is much more versatile and will fit into the budget, skill sets, and other requirements of a wider range of teams. Most significantly, the platform offers free versions that are great for teams just getting started with this type of technology.

Cognos vs. Power BI at a Glance

Category winners at a glance:

  • Core Features: Dependent on use case.
  • Ease of Use and Implementation: Power BI.
  • Advanced Analytics Capabilities: Cognos.
  • Cloud vs. On-Prem: Power BI for cloud; Cognos for on-prem.
  • Integrations: Dependent on use case.
  • Pricing: Power BI.

What Is Cognos?

An example of an interactive dashboard built in Cognos Analytics. Source: IBM

Cognos Analytics is a business intelligence suite of solutions from IBM that combines AI-driven assistance, advanced reporting and analytics, and other tools to support various enterprise data management requirements. The platform is available both in the cloud and on demand for on-premises and custom enterprise network configurations.

With its range of features, Cognos enables users to connect, verify, and combine data and offers plenty of dashboard and visualization options. Cognos is particularly good at pulling and analyzing corporate data, providing detailed reports, and assisting in corporate governance. It is built on a strong data science foundation and is supported by heavy-duty analytics and recommendations, courtesy of IBM Watson.

Also see: Top Business Intelligence Software

Key Features of Cognos

Powered by the latest version of Watson, Cognos Analytics offers AI assistance that all users can access through natural language queries. Source: IBM
  • AI-driven insights: The platform benefits from veteran AI support in the form of Watson, which helps with data visualization design, dashboard builds, forecasting, and data explainability. This is particularly helpful for users with limited data science and coding experience who need to pull in-depth analyses from complex datasets.
  • Data democratization through natural language: Advanced natural language capabilities make it possible for citizen data scientists and less-experienced tech professionals to create accurate and detailed data visualizations.
  • Advanced reporting and dashboarding: Multi-user reports and dashboards, personalized report generation, AI-powered dashboard design, and easy shareability make this a great platform for organizations that require different levels of data visibility and granularity for different stakeholders.
  • Automation and governance: Extensive automation and governance capabilities help power users scale their operations without compromising data security. The platform’s robust governance and security features are important to highly regulated businesses and large enterprises in particular.

Pros

  • The platform is well integrated with other business tools, like Slack and various email inboxes, making it easier to collaborate and share insights across a team.
  • Its AI assistant works well for a variety of data analytics and management tasks, even for users with no data science experience, because of its natural language interface.
  • Cognos comes with flexible deployment options, including on-demand cloud, hosted cloud, and client hosting for either on-premises or IaaS infrastructure.

Cons

  • The platform is not particularly mobile-friendly compared to similar competitors.
  • While a range of visuals are available on the platform, many user reviews indicate that the platform’s visuals are limited and not very customizable.
  • Depending on your exact requirements, Cognos Analytics can become quite expensive, especially if you have a high user count or require more advanced features like security and user management.

What Is Power BI?

An example setup for a Microsoft Power BI dashboard. Source: Microsoft

Microsoft Power BI is a business intelligence and data visualization software solution that acts as one part of the Microsoft Power Platform. Because of its unification with other Power Platform products like Power Automate, Power Apps, and Power Pages, this BI tool gives users diverse low-code and AI-driven operations for more streamlined data analytics and management. Additional integrations with the likes of Microsoft 365, Teams, Azure, and SharePoint are a major selling point, as many business users are already highly invested in these business applications and are familiar with the Microsoft approach to UX/UI.

Specific to analytics functions, Power BI focuses most heavily on data preparation, data discovery, dashboards, and data visualization. Its core features enable users to take visualizations to the next level and empower them to make data-driven decisions, collaborate on reports, and share insights across popular applications. They can also create and modify data reports and dashboards easily and share them securely across applications.

Key Features of Power BI

Power BI seamlessly integrates with Microsoft’s ERP and CRM software, Dynamics 365, and makes it easier for users to analyze sales data with visualization templates. Source: Microsoft.
  • Rapidly expanding AI analytics: AI-powered data analysis and report creation have already been established in this platform, but recently, the generative AI Copilot tool has also come into preview for Power BI. This expands the platform’s ability to create reports more quickly, summarize and explain data in real time, and generate DAX calculations.
  • CRM integration: Power BI integrates relatively well with Microsoft Dynamics CRM, which makes it a great option for in-depth marketing and sales analytics tasks. Many similar data platforms do not offer such smooth CRM integration capabilities.
  • Embedded and integrated analytics: The platform is available in many different formats, including as an embedded analytics product, making it possible for users of other Microsoft products to easily incorporate advanced analytics into the tools they already use. You can also embed detailed reports in other apps for key stakeholders who need information in a digestible format (see the REST API sketch after this feature list).
  • Comprehensive visualizations: Adjustable dashboards, AI-generated and templated reports, and a variety of self-service features enable users to set up visuals that can be alphanumeric, graphical, or even include geographic regions and maps. Power BI’s many native visualization options mean users won’t have to spend too much time trying to custom-fit their dashboards and reports to their company’s specific needs.
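
For the embedded-analytics scenario above, applications typically call Power BI’s documented REST API. A minimal sketch that lists the reports available to a signed-in user; acquiring the Azure AD token (for example, via MSAL) is elided here.

```python
# pip install requests
import requests

token = "<aad-access-token>"  # placeholder; obtain via Azure AD / MSAL

resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/reports",
    headers={"Authorization": f"Bearer {token}"},
    timeout=10,
)
resp.raise_for_status()
for report in resp.json()["value"]:
    print(report["name"], report["webUrl"])
```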

Pros

  • Power BI is one of the more mobile-friendly data platforms on the market today.
  • In addition to its user-friendly and easy-to-learn interface, Microsoft offers a range of learning resources and is praised for its customer support.
  • Its AI-powered capabilities continue to grow, especially through the company’s close partnership with OpenAI.

Cons

  • Some users have commented on the tool’s outdated interface and how data updates, especially for large amounts of data, can be slow and buggy.
  • The platform, especially the Desktop tool, uses a lot of processing power, which can occasionally lead to slower load times and platform crashes.
  • Shareability and collaboration features are incredibly limited outside of its highest paid plan tier.

Best for Core Features: It Depends

It’s a toss-up when it comes to the core features Cognos Analytics and Power BI bring to the table.

Microsoft Power BI’s core features include a capable mobile interface, AI-powered analytics, democratized report-building tools and templates, and intuitive integrations with other Microsoft products.

IBM Cognos Analytics’ core features include a web-based report authoring tool, natural-language and AI-powered analytics, customizable dashboards, and security and access management capabilities. Both tools offer a variety of core features that work to balance robustness and accessibility for analytics tasks.

To truly differentiate itself, Microsoft consistently releases updates to its cloud-based services, with notable updates and feature additions over the past couple of years including AI-infused experiences, smart narratives (NLG), and anomaly detection capabilities. Additionally, a Power BI Premium version enables multi-geography capabilities and the ability to deploy capacity to one of dozens of data centers worldwide.

On the other hand, IBM has done extensive work to update the Cognos home screen, simplifying the user experience and giving it a more modern look and feel. Onboarding for new users has been streamlined with video tutorials and accelerator content organized in an easy-to-consume format. Additionally, improved search capabilities and enhancements to the Cognos AI Assistant and Watson features help generate dashboards automatically, recommend the best visualizations, and suggest questions to ask — via natural language query — to dive deeper into data exploration.

Taking these core capabilities and recent additions into account, which product wins on core features? Well, it depends on the user’s needs. For most users, Power BI is a stronger option for general cloud and mobility features, while Cognos takes the lead on advanced reporting, data governance, and security.

Also see: Top Dashboard Software & Tools

Best for Ease of Use and Implementation: Power BI

Although it’s close, new users of these tools seem to find Power BI a little easier to use and set up than Cognos Analytics.

As the complexity of your requirements rises, though, the Power BI platform grows more difficult to navigate. Users who are familiar with Microsoft tools will be in the best position to use the platform seamlessly, as they can take advantage of skills from applications they already use, such as Microsoft Excel, to move from building to analyzing to presenting with less data preparation. Further, all Power BI users have access to plenty of free learning opportunities that enable them to rapidly start building reports and dashboards.

Cognos, on the other hand, has a more challenging learning curve, but IBM has been working on this, particularly with recent user interface updates, guided UI for dashboard builds, and assistive AI. The tool’s AI-powered and Watson-backed analytics capabilities in particular lower the barrier of entry to employing advanced data science techniques.

The conclusion: Power BI wins on broad usage by a non-technical audience, whereas IBM has the edge with technical users and continues to improve its stance with less-technical users. Overall, Power BI wins in this category due to generally more favorable user reviews and commentary about ease of use.

Also see: Top AI Software

Best for Advanced Analytics Capabilities: Cognos

Cognos Analytics surpasses Power BI for its variety of in-depth and advanced analytics operations.

Cognos integrates nicely with other IBM solutions, like the IBM Cloud Pak for Data platform, which extends the tool’s already robust data analysis and management features. It also brings together a multitude of data sources as well as an AI Assistant tool that can communicate in plain English, sharing fast recommendations that are easy to understand and implement. Additionally, the platform generates an extensive collection of visualizations, including geospatial mapping and dashboards that let users drill down, up, or across through visuals that update in real time.

Recent updates to Cognos’s analytical capabilities include a display of narrative insights in dashboard visualizations to show meaningful aspects of a chart’s data in natural language, the ability to specify the zoom level for dashboard viewing and horizontal scrolling in visualizations, as well as other visualization improvements.

On the modeling side of Cognos, data modules can be dynamically redirected to different data server connections, schemas, or catalogs at run-time. Further, the Convert and Relink options are available for all types of referenced tables, and better web-based modeling has been added.

However, it’s important to note that Cognos still takes a comparatively rigid, templated approach to visualization, which makes custom configurations difficult or even impossible for certain use cases. Additionally, some users say it takes extensive technical aptitude to do more complex analysis.

Power BI’s strength is out-of-the-box analytics that doesn’t require extensive integration or data science smarts. It regularly adds to its feature set. More recently, it has added new features for embedded analytics that enable users to embed an interactive data exploration and report creation experience in applications such as Dynamics 365 and SharePoint.

For modeling, Microsoft has added two new statistical DAX functions, making it possible to simultaneously filter more than one table in a remote source group. It also offers an Optimize ribbon in Power BI Desktop to streamline the process of authoring reports (especially in DirectQuery mode) and more conveniently launch Performance Analyzer to analyze queries and generate report visuals. And while Copilot is still in preview at this time, this tool shows promise for advancing the platform’s advanced analytics capabilities without negatively impacting its ease of use.

In summary, Power BI is good at crunching and analyzing real-time data and continues to grow its capabilities, but Cognos Analytics maintains its edge, especially because Cognos can conduct far deeper analytics explorations on larger amounts of data without as many reported performance issues.

Also see: Data Analytics Trends

Best for Cloud Users: Power BI; Best for On-Prem Users: Cognos

Both platforms offer cloud and on-premises options for users, but each one has a clear niche: Power BI is most successful on the cloud, while Cognos has its roots in on-prem setups.

Power BI has a fully functional SaaS version running in Azure as well as an on-premises version in the form of Power BI Report Server. Power BI Desktop is also offered for free as a standalone personal analysis tool.

Although Power BI does offer on-prem capabilities, power users who are engaged in complex analysis of multiple on-premises data sources typically still need to download Power BI Desktop in addition to working with Power BI Report Server. The on-premises product is incredibly limited when it comes to dashboards, streaming analytics, natural language, and alerting.

Cognos also offers both cloud and on-premises versions, with on-demand, hosted, and flexible on-premises deployment options that support reporting, dashboarding, visualizations, alerts and monitoring, AI, and security and user management, regardless of which deployment you choose. However, Cognos’ DNA is rooted in on-prem, so it lags behind Microsoft on cloud-based bells and whistles.

Therefore, Microsoft gets the nod for cloud analytics, and Cognos for on-prem, but both are capable of operating in either format.

Also see: Top Data Visualization Tools

Best for Integrations: It Depends

Both Cognos Analytics and Power BI offer a range of helpful data storage, SaaS, and operational tool integrations that users find helpful. Ultimately, neither tool wins this category because they each have different strengths here.

Microsoft offers an extensive array of integration options natively, as well as APIs and partnerships that help to make Power BI more extensible. Power BI is tightly embedded into much of the Microsoft ecosystem, which makes it ideally suited for current Azure, Dynamics, Microsoft 365, and other Microsoft customers. However, the company is facing some challenges when it comes to integrations beyond this ecosystem, and some user reviews have reflected frustrations with that challenge.

IBM Cognos connects to a large number of data sources, including spreadsheets. It is well integrated into several parts of the vast IBM portfolio. It integrates nicely, for example, with the IBM Cloud Pak for Data platform and more recently has added integration with Jupyter notebooks. This means users can create and upload notebooks into Cognos Analytics and work with Cognos Analytics data in a notebook using Python scripts. The platform also comes with useful third-party integrations and connectors for tools like Slack, which help to extend the tool’s collaborative usage capabilities.
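
Inside a Cognos notebook, IBM’s data-connector helper hands Cognos content to Python as a DataFrame. The module name and call below sketch that documented pattern from memory and may differ by Cognos version.

```python
# Runs inside a Cognos Analytics Jupyter notebook, where IBM preinstalls
# its data-connector helper; module and signature may vary by version.
from ca_data_connector import CADataConnector

# Hypothetical path to a data module saved in "My content".
df = CADataConnector.read_data(path=".my_folders/Sales data module")
print(df.head())  # a pandas DataFrame, ready for Python scripting
```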

This category is all about which platform and IT ecosystem you live within, so it’s hard to say which tool offers the best integration options for your needs. Those invested in Microsoft will enjoy tight integration within that sphere if they select Power BI. Similarly, those who are committed to all things IBM will enjoy the many ways IBM’s diverse product and service set fit with Cognos.

Also see: Digital Transformation Guide: Definition, Types & Strategy

Best for Pricing: Power BI

While Cognos Analytics offers some lower-level tool features at a low price point, Power BI offers more comprehensive and affordable entry-level packages to its users.

Microsoft is very good at keeping prices low as a tactic for growing market share. It offers a lot of features at a relatively low price. Power BI Pro, for example, costs approximately $10 per user per month, while the Premium plan is $20 per user per month. Free, somewhat limited versions of the platform are also available via Power BI Desktop and free Power BI accounts in Microsoft Fabric.

The bottom line for any rival is that it is hard to compete with Microsoft Power BI on price, especially because many of its most advanced features — including automated ML capabilities and AI-powered services — are available in affordable plan options.

IBM Cognos Analytics, on the other hand, has a reputation for being expensive. It is hard for IBM to compete with Power BI on price alone.

IBM Cognos Analytics pricing starts at $10 per user per month for on-demand cloud access and $5 per user per month for limited mobile user access to visuals and alerts on the cloud-hosted or client-hosted versions. For users who want more than viewer access and the most basic of capabilities, pricing can be anywhere from $40 to $450 per user per month.

Because of the major differences in what each product offers in its affordable plans, Microsoft wins on pricing.

Also see: Data Mining Techniques

Why Shouldn’t You Use Cognos or Power BI?

While both data and BI platforms offer extensive capabilities and useful features to users, it’s possible that these tools won’t meet your particular needs or align with industry-specific use cases in your field. If any of the following points are true for your business, you may want to consider an alternative to Cognos or Power BI:

Who Shouldn’t Use Cognos

The following types of users and companies should consider alternatives to Cognos Analytics:

  • Users or companies with smaller budgets or who want a straightforward, single pricing package; Cognos tends to have up-charges and add-ons that are only available at an additional cost.
  • Users who require extensive customization capabilities, particularly for data visualizations, dashboards, and data exploration.
  • Users who want a more advanced cloud deployment option.
  • Users who have limited experience with BI and data analytics technology; this tool has a higher learning curve than many of its competitors and limited templates for getting started.
  • Users who are already well established with another vendor ecosystem, like Microsoft or Google.

Who Shouldn’t Use Power BI

The following types of users and companies should consider alternatives to Power BI:

  • Users who prefer to do their work online rather than on a mobile device; certain features are buggy outside of the mobile interface.
  • Users who are not already well acquainted and integrated with the Microsoft ecosystem may face a steep learning curve.
  • Users who prefer to manage their data in data warehouses rather than spreadsheets; while data warehouse and data lake integrations are available, including for Microsoft’s OneLake, many users run into issues with data quality in Excel.
  • Users who prefer a more modern UI that updates in real time.
  • Users who primarily use Macs and Apple products; some users have reported bugs when attempting to use Power BI Desktop on these devices.

Also see: Best Data Analytics Tools

If Cognos or Power BI Isn’t Ideal for You, Check Out These Alternatives

While Cognos and Power BI offer extensive features that will meet the needs of many BI teams and projects, they may not be the best fit for your particular use case. The following alternatives may prove a better fit:


Domo

Domo puts data to work for everyone so they can extend their data’s impact on the business. Underpinned by a secure data foundation, the platform’s cloud-native data experience makes data visible and actionable with user-friendly dashboards and apps. Domo is highly praised for its ability to help companies optimize critical business processes at scale and quickly.


Yellowfin

Yellowfin is a leading embedded analytics platform that offers intuitive self-service BI options. It is particularly successful at accelerating data discovery. Additionally, the platform allows anyone, from an experienced data analyst to a non-technical business user, to create reports in a governed way.


Wyn Enterprise

Wyn Enterprise offers a scalable embedded business intelligence platform without hidden costs. It provides BI reporting, interactive dashboards, alerts and notifications, localization, multitenancy, and white-labeling in a variety of internal and commercial apps. Built for self-service BI, Wyn offers extensive visual data exploration capabilities, creating a data-driven mindset for the everyday user. Wyn’s scalable, server-based licensing model allows room for your business to grow without user fees or limits on data size.


Zoho Analytics

Zoho Analytics is a top BI and data analytics platform that works particularly well for users who want self-service capabilities for data visualizations, reporting, and dashboarding. The platform is designed to work with a wide range of data formats and sources, and most significantly, it is well integrated with a Zoho software suite that includes tools for sales and marketing, HR, security and IT management, project management, and finance.


Sigma

Sigma is a cloud-native analytics platform that delivers real-time insights, interactive dashboards, and reports, so you can make data-driven decisions on the fly. With Sigma’s intuitive interface, you don’t need to be a data expert to dive into your data, as no coding or SQL is required to use this tool. Sigma has also recently brought forth Sigma AI features for early access preview.

Review Methodology

The two products in this comparison guide were assessed through a combination of reading product materials on vendor sites, watching demo videos and explanations, reviewing customer reviews across key metrics, and directly comparing each product’s core features through a comparison graph.

Below, you will see four key review categories that we focused on in our research. The percentages used for each of these categories represent the weight of the categorical score for each product.

User experience – 30%

Our review placed a heavy emphasis on user experience, considering both ease of use and implementation as well as the maturity and reliability of product features. We looked for features like AI assistance and low-code/no-code capabilities that lessened the learning curve, as well as learning materials, tutorials, and consistent customer support resources. Additionally, we paid attention to user reviews that commented on the product’s reliability and any issues with bugs, processing times, product crashes, or other performance issues.

Advanced analytics and scalability – 30%

To truly do business intelligence well, especially for modern data analytics requirements, BI tools need to offer advanced capabilities that scale well. For this review, we emphasized AI-driven insights, visuals that are configurable and updated in real time, shareable and collaborative reports and dashboards, and comprehensive features for data preparation, data modeling, and data explainability. As far as scalability goes, we not only looked at the quality of each of these tools but also assessed how well they perform and process data on larger-scale operations. We particularly highlighted any user reviews that mentioned performance lag times or other issues when processing large amounts of data.

Integrations and platform flexibility – 20%

Because these platforms need to be well integrated into a business’s data sources and most-used business applications to be useful, our assessment also paid attention to how integrable and flexible each platform was for different use cases. We considered not only how each tool integrates with other tools from the same vendor but also which data sources, collaboration and communication applications, and other third-party tools are easy to integrate with native integrations and connectors. We also considered the quality of each tool’s APIs and other custom opportunities for integration, configuration, and extensibility.

Affordability – 20%

While affordability is not the be-all and end-all when it comes to BI tools, it’s important to many users that they find a tool that balances an accessible price point with a robust feature set. That’s why we also looked at each tool’s affordability, focusing on entry price points, what key features are and are not included in lower-tier pricing packages, and the jumps in pricing that occur as you move from tier to tier. We also considered the cost of any additional add-ons that users might need, as well as the potential cost of partnering with a third-party expert to implement the software successfully.
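To make the weighting concrete, here is a minimal sketch in Python of how categorical scores roll up into an overall rating under this methodology. All scores are hypothetical placeholders, not our actual review figures:

```python
# Category weights mirror the methodology above: 30/30/20/20.
WEIGHTS = {
    "user_experience": 0.30,
    "advanced_analytics_and_scalability": 0.30,
    "integrations_and_flexibility": 0.20,
    "affordability": 0.20,
}

def overall_score(category_scores: dict) -> float:
    """Weighted sum of per-category scores (0-5 scale)."""
    return sum(WEIGHTS[cat] * score for cat, score in category_scores.items())

# Illustrative numbers only.
example_tool = {
    "user_experience": 4.5,
    "advanced_analytics_and_scalability": 4.0,
    "integrations_and_flexibility": 4.5,
    "affordability": 4.5,
}
print(round(overall_score(example_tool), 2))  # 4.35
```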

Bottom Line: Cognos vs. Power BI

Microsoft is committed to investing heavily in Power BI and enhancing its integrations across other Microsoft platforms and a growing number of third-party solutions. Any organization that is a heavy user of Office 365, Teams, Dynamics, and/or Azure will find it hard to resist the advantages of deploying Power BI.

And those advantages are only going to increase. On the AI front, for example, the company boasts around 100,000 customers using Power BI’s AI services. It is also putting effort into expanding its AI capabilities, with the generative AI-driven Copilot now in preview for Power BI users. For users with an eye on their budget who don’t want to compromise on advanced analytics and BI features, Power BI is an excellent option.

But IBM isn’t called Big Blue for nothing. It boasts a massive sales and services team and global reach into large enterprise markets. It has also vastly expanded its platform’s AI capabilities, making it a strong tool for democratized data analytics and advanced analytics tasks across the board.

Where Cognos Analytics has its most distinct advantage is at the high end of the market. Microsoft offers most of the features that small, midsize, and larger enterprises need for analytics. However, at the very high end of the analytics market, and in corporate environments with hefty governance and reporting requirements or legacy and on-premises tooling, Cognos has carved out a strategic niche that it serves well.

Ultimately, either tool could work for your organization, depending on your budget, requirements, and previous BI tooling experience. The most important step you can take is to speak directly with representatives from each of these vendors, demo these tools, and determine which product includes the most advantageous capabilities for your team.

Read next: 10 Best Machine Learning Platforms

Snowflake vs. Databricks: Comparing Cloud Data Platforms https://www.eweek.com/big-data-and-analytics/snowflake-vs-databricks/ Tue, 31 Oct 2023 15:30:31 +0000 https://www.eweek.com/?p=221049

Drawing a comparison between top data platforms Snowflake and Databricks is crucial for today’s businesses because data analytics and data management are now deeply essential to their operations and opportunities for growth. Which data platform is best for your business?

In short, Snowflake is more suited for standard data transformation and analysis and for those users familiar with SQL. Databricks is geared for streaming, ML, AI, and data science workloads courtesy of its Spark engine, which enables the use of multiple development languages.

Both Snowflake and Databricks provide the volume, speed, and quality demanded by business intelligence applications. But there are as many similarities as there are differences. When examined closely, it becomes clear that these two cloud-based data platforms have a different orientation. Therefore, selection often boils down to tool preference and suitability for the organization’s data strategy.

What Is Snowflake?

Snowflake is a major cloud company that focuses on data-as-a-service features and functions for big data operations. Its core platform is designed to seamlessly integrate data from various business apps and formats into a unified data store. Consequently, typical extract, transform, and load (ETL) operations may not be necessary to get the data integration results you need.

The platform is compatible with various types of business workloads, including artificial intelligence and machine learning, data lakes and data warehouses, and cybersecurity workloads. It is ideally designed for organizations that are working with large quantities of data that require precise data governance and management systems in place.

What Is Databricks?

Databricks is a data-driven vendor with products and services that focus on data lake and warehouse development as well as AI-driven analytics and automation. Its flagship lakehouse platform includes unified analytics and AI management features, data sharing and governance capabilities, AI and machine learning, and data warehousing and engineering.

Users can access certain platform features through an open-source format, making this a highly extensible and customizable solution for developers. It’s also a popular solution for users who want to incorporate other AI or IDE integrations into their setup.

Snowflake vs. Databricks: Comparing Key Features

We’ll compare these two data companies in greater detail in the sections to come, but for a quick scan, here is how Snowflake and Databricks stack up across a few key metrics and categories:

  • Support and Ease of Use: Snowflake
  • Security: Tied
  • Integrations: Databricks
  • AI Features: Databricks
  • Pricing: Dependent on use case

Snowflake vs. Databricks: Architecture Comparison

Snowflake is a relational database management system and analytics data warehouse for structured and semi-structured data.

Offered via the software-as-a-service (SaaS) model, Snowflake uses a SQL database engine to manage how information is stored in the database. It can process queries against multiple virtual warehouses within the overall warehouse, each running in its own cluster node, independent of the others, so that they do not share compute resources.

Sitting on top of that database engine are cloud services for authentication, infrastructure management, queries, and access controls. The Snowflake Elastic Data Warehouse enables users to analyze and store data utilizing Amazon S3 or Azure resources.

Databricks is also cloud-based but is built on Apache Spark. Its management layer is designed around Apache Spark’s distributed computing framework to make infrastructure management easier. Databricks positions itself as a data lake rather than a data warehouse, so the emphasis is more on use cases such as streaming, machine learning, and data science-based analytics.

Databricks can be used to handle raw, unprocessed data in large volumes. It is delivered as SaaS and can run on AWS, Azure, and Google Cloud. A control plane hosts its backend services, while a data plane handles compute. Its query engine is said to offer high performance via a caching layer. Snowflake includes a storage layer, while Databricks provides storage by running on top of AWS S3, Azure Blob Storage, and Google Cloud Storage.
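To make that storage model concrete, here is a minimal PySpark sketch of the pattern Databricks encourages: reading raw files straight from customer-managed object storage, transforming them with Spark, and writing curated results back. The bucket paths and column name are hypothetical:

```python
from pyspark.sql import SparkSession

# In a Databricks notebook a `spark` session is provided automatically;
# building one explicitly keeps this sketch self-contained.
spark = SparkSession.builder.appName("lakehouse-sketch").getOrCreate()

# Read raw JSON directly from object storage (an S3 path here; Azure Blob
# Storage or Google Cloud Storage paths work the same way).
events = spark.read.json("s3://example-bucket/raw/events/")  # hypothetical path

# Transform and aggregate on the data lake, then write curated output back.
daily_counts = events.groupBy("event_date").count()
daily_counts.write.mode("overwrite").parquet(
    "s3://example-bucket/curated/daily_counts/"
)
```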

For those wanting a top-class data warehouse, Snowflake wins. But for those needing more robust ELT, data science, and machine learning features, Databricks is the winner.

Snowflake vs. Databricks: Support and Ease of Use Comparison

The Snowflake data warehouse is said to be user-friendly, with an intuitive SQL interface that makes it easy to get set up and running. It also has plenty of automation features to facilitate ease of use. Auto-scaling and auto-suspend, for example, help in stopping and starting clusters during idle or peak periods. Clusters can be resized easily.
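Those automation features are set declaratively. As a rough illustration, here is a minimal sketch using the snowflake-connector-python package; the warehouse name and credentials are hypothetical, and multi-cluster scaling is available only on higher Snowflake editions:

```python
import snowflake.connector

# Hypothetical credentials -- substitute your own account details.
conn = snowflake.connector.connect(
    user="ANALYST",
    password="***",
    account="myorg-myaccount",
)
cur = conn.cursor()

# Suspend after 60 idle seconds, resume automatically on the next query,
# and (on editions that support it) scale out to a second cluster under load.
cur.execute("""
    ALTER WAREHOUSE analytics_wh SET
        AUTO_SUSPEND = 60
        AUTO_RESUME = TRUE
        MIN_CLUSTER_COUNT = 1
        MAX_CLUSTER_COUNT = 2
""")
conn.close()
```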

Databricks, too, has auto-scaling for clusters. The UI is more complex for general-purpose clusters and tools, but the Databricks SQL Warehouse uses a straightforward “t-shirt sizing” approach to clusters that makes it a user-friendly solution as well.

Both tools emphasize ease of use in certain capacities, but Databricks is intended for a more technical audience, so certain steps like updating configurations and switching options may involve a steeper learning curve.

Both Snowflake and Databricks offer online, 24/7 support, and both have received high praise from customers in this area.

Though both are top players in this category, Snowflake wins for its wider range of user-friendly and democratized features.

Also see: Top Business Intelligence Software

Snowflake vs. Databricks: Security Comparison

Snowflake and Databricks both provide role-based access control (RBAC) and automatic encryption. Snowflake adds network isolation and other robust security features in tiers with each higher tier costing more. But on the plus side, you don’t end up paying for security features you don’t need or want.

Databricks, too, includes plenty of valuable security features. Both data vendors comply with SOC 2 Type II, ISO 27001, HIPAA, GDPR, and more.

No clear winner in this category.

Snowflake vs. Databricks: Integrations Comparison

Snowflake is on the AWS Marketplace but is not deeply embedded within the AWS ecosystem. In some cases, it can be challenging to pair Snowflake with other tools. But in other cases, Snowflake is wonderfully integrated. Apache Spark, IBM Cognos, Tableau, and Qlik are all fully integrated. Those using these tools will find analysis easy to accomplish.

Both tools support semi-structured and structured data. Databricks has more versatility in terms of supporting any format of data, including unstructured data. Snowflake is adding support for unstructured data now, too.

Databricks wins this category.

Also see: Top Data Mining Tools 

Snowflake vs. Databricks: AI Features Comparison

Both Snowflake and Databricks include a range of AI and AI-supported features in their portfolio, and the number only seems to grow as both vendors adopt generative AI and other advanced AI and ML capabilities.

Snowflake supports a range of AI and ML workloads, and in recent years it has added two AI-focused offerings to its portfolio: Snowpark and Streamlit. Snowpark offers users several libraries, runtimes, and APIs that are useful for ML and AI training as well as MLOps. Streamlit, now in public preview within Snowflake, can be used to build interactive data apps (including apps that surface ML models) with Snowflake data and Python development best practices.

Databricks, on the other hand, has more heavily intertwined AI in all of its products and services and for a longer time. The platform includes highly accessible machine learning runtime clusters and frameworks, autoML for code generation, MLflow and a managed version of MLflow, model performance monitoring and AI governance, and tools to develop and manage generative AI and large language models.
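To give a flavor of the MLflow piece, here is a minimal experiment-tracking sketch using the open-source mlflow package; the model, data, and metric are illustrative stand-ins rather than a Databricks-specific workflow:

```python
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Each run records parameters, metrics, and the model artifact, which is
# the kind of ML life-cycle traceability described above.
with mlflow.start_run(run_name="rf-baseline"):
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", accuracy)
    mlflow.sklearn.log_model(model, "model")
```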

While both vendors are making major strides in AI, Databricks takes the win here.

Snowflake vs. Databricks: Price Comparison

There is a great deal of difference in how these tools are priced. But speaking very generally: Databricks is priced at around $99 a month. There is also a free version. Snowflake works out at about $40 a month, though it isn’t as simple as that.

Snowflake keeps compute and storage separate in its pricing structure. And its pricing is complex with five different editions from basic up, and prices rise as you move up the tiers. Pricing will vary tremendously depending on the workload and the tier involved.

As storage is not included in its pricing, Databricks may work out cheaper for some users. It all depends on the way the storage is used and the frequency of use. Compute pricing for Databricks is also tiered and charged per unit of processing. The differences between them make it difficult to do a full apples-to-apples comparison. Users are advised to assess the resources they expect to need to support their forecast data volume, amount of processing, and their analysis requirements. For some users, Databricks will be cheaper, but for others, Snowflake will come out ahead.
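Because the billing models differ so much, a simple spreadsheet-style model is often the fastest way to compare them for your own workload. Here is a rough Python sketch; every rate and usage figure is an illustrative assumption, not a vendor list price:

```python
# All figures are placeholders -- plug in your negotiated rates and forecasts.
def snowflake_monthly(credits_used: float, credit_price: float,
                      storage_tb: float, storage_price_per_tb: float) -> float:
    # Snowflake bills compute (credits) and storage separately.
    return credits_used * credit_price + storage_tb * storage_price_per_tb

def databricks_monthly(dbus_used: float, dbu_price: float,
                       cloud_storage_cost: float) -> float:
    # Databricks bills per processing unit (DBU); storage is whatever you
    # already pay your cloud provider for S3, Blob Storage, or GCS.
    return dbus_used * dbu_price + cloud_storage_cost

print(snowflake_monthly(100, 3.00, 2, 40.0))  # 380.0
print(databricks_monthly(500, 0.22, 150.0))   # 260.0
```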

This is a close one as it varies from use case to use case.

Also see: Real-Time Data Management Trends

Bottom Line: Snowflake vs. Databricks

Snowflake and Databricks are both excellent data platforms for data analysis purposes. Each has its pros and cons. Choosing the best platform for your business comes down to usage patterns, data volumes, workloads, and data strategies.

Snowflake is more suited for standard data transformation and analysis and for those users familiar with SQL. Databricks is more suited to streaming, ML, AI, and data science workloads courtesy of its Spark engine, which enables the use of multiple development languages. Snowflake has been playing catchup on languages and recently added support for Python, Java, and Scala.

Some say Snowflake is better for interactive queries as it optimizes storage at the time of ingestion. It also excels at handling BI workloads, and the production of reports and dashboards. As a data warehouse, it offers good performance. Some users note, though, that it struggles when faced with huge data volumes as would be found with streaming workloads. In a straight competition on data warehousing capabilities, Snowflake wins.

But Databricks isn’t really a data warehouse at all. Its data platform is wider in scope with better capabilities than Snowflake for ELT, data science, and machine learning. Users store data in managed object storage of their choice. It focuses on the data lake and data processing. But it is squarely aimed at data scientists and professional data analysts.

In summary, Databricks wins for a technical audience. Snowflake is highly accessible to a technical and less technical user base. Databricks provides pretty much every data management feature offered by Snowflake and a lot more. But it isn’t quite as easy to use, has a steeper learning curve, and requires more maintenance. Regardless though, Databricks can address a much wider set of data workloads and languages, and those familiar with Apache Spark will tend to gravitate toward Databricks.

Snowflake is better set up for users who want to deploy a good data warehouse and analytics tool rapidly without getting bogged down in configuration, data science minutiae, or manual setup. But this isn’t to say that Snowflake is a light tool or for beginners. Far from it.

But it isn’t high-end like Databricks, which is aimed more at complex data engineering, ETL, data science, and streaming workloads. Snowflake, in contrast, is a warehouse to store production data for analytics purposes. It is accessible for beginners, too, and for those who want to start small and scale up gradually.

Pricing comes into the selection picture, of course. Sometimes Databricks will be much cheaper due to the way it allows users to take care of their own storage. But not always. Sometimes Snowflake will pan out cheaper.

Jitterbit CEO George Gallegos on Tech Integration in Enterprise Infrastructure https://www.eweek.com/it-management/jitterbit-tech-integration-in-enterprise-infrastructure/ Thu, 12 Oct 2023 23:03:34 +0000 https://www.eweek.com/?p=223194

I spoke with George Gallegos, CEO at Jitterbit, about how automation and integration technology allow the many disparate aspects of enterprise IT to function in tandem.

Among the topics we discussed: 

  • Let’s talk about integration technology in the enterprise. How does it work in terms of, say, integrating cloud and legacy in-house apps?
  • What are the challenges in integration? The typical headaches? How do you recommend companies handle these challenges?
  • How is Jitterbit addressing the integration needs of its clients?
  • The future of tech integration in the enterprise? Will it ever get easy?

Listen to the podcast:

Also available on Apple Podcasts

Watch the video:

Modernizing the Mainframe—IBM Introduces Watsonx Code Assistant for Z https://www.eweek.com/it-management/modernizing-the-mainframe-ibm-introduces-watsonx-code-assistant-for-z/ Mon, 09 Oct 2023 17:43:06 +0000 https://www.eweek.com/?p=223118 IBM has introduced watsonx Code Assistant for Z, an AI-powered tool for mainframe modernization, offering developers insight into how code will work.

“Modernization” and “legacy” are two of the most used and abused terms in the tech industry.

How so? On the upside, they accurately, if simplistically, describe the technical and market dynamics of a forward-focused industry that is quick to develop innovations and products designed to enhance performance and user experience.

But on the downside, the terms reflect the industry’s longstanding obsession with building, marketing and profiting from new products to the point of claiming, often without evidence, that they are superior to solutions already residing in client data centers.

Most important, though, vendors continually enhance existing solutions and platforms to ensure that they remain relevant to the needs of modern enterprises. IBM’s new watsonx Code Assistant for Z is a good example of one such effort.

Modernization vs. Legacy Hype

That “new” doesn’t automatically translate to “better” is a bit of practical wisdom that is seldom, if ever, seen in tech industry ad copy. Instead, vendors tend to hype shiny new things—claiming the innate superiority of this year’s gear over previous generation systems and platforms.

Certainly, new or next gen CPUs, storage media, interconnects and other technologies typically deliver better and/or more efficient performance. However, the value of ripping out existing or older systems and replacing them with new hardware is usually vastly overrated, often resembling a case of “fixing what isn’t broken.” The process is also expensive for customers, sometimes hugely so, due to costs related to system integration, software upgrades and retraining and certifying IT personnel.

In addition, generational shifts can make it increasingly difficult for businesses to find new system administrators, developers and technicians as existing staff members age-out. As is true in most other industries, younger workers typically prefer to explore and use new and emerging technologies.

That is a scenario that IBM plans to mitigate and avoid with its new watsonx Code Assistant for Z.

What is it? According to the company, the new solution is a generative AI-assisted product that is designed to enable faster translation of COBOL to Java on IBM Z, thus saving developers time and enhancing their productivity. It also joins IBM watsonx Code Assistant for Red Hat Ansible Lightspeed (scheduled for release later this year) in the watsonx Code Assistant product family.

Both solutions leverage IBM’s watsonx.ai code model, which the company says will employ knowledge of 115 coding languages learned from 1.5 trillion tokens. According to IBM, at 20 billion parameters, the watsonx.ai code model will be one of the largest generative AI foundation models for computer code automation.

Why is this important? First, because of the sheer pervasiveness of COBOL. Enterprise developers and software engineers have written hundreds of billions of lines of COBOL code. Plus, due to its notable flexibility and reliability, COBOL is still widely used, reportedly supporting some $3 trillion in daily financial transactions. In other words, COBOL is literally “business critical” to tens of thousands of large enterprises, millions of smaller companies and billions of consumers.

Also see: Top Digital Transformation Companies

COBOL Meets Watsonx Code Assistant for Z

Despite its vital position in transaction processing, COBOL is hardly a favorite among young computer professionals. Though COBOL and other mainframe programmers earn premium salaries (according to IBM, some 20-30 percent more than their peers), employers struggle to fill available positions.

That’s where IBM’s watsonx Code Assistant for Z comes in. The company notes that the new solution is designed to make it easier for developers to selectively choose and evolve COBOL business services into well architected, high-quality Java code.

Plus, IBM believes watsonx generative AI can enable developers to quickly assess, update, validate and test the right code, allowing them to efficiently modernize even large scale applications.

The Java on Z code resulting from watsonx Code Assistant for Z will be object-oriented and is designed to be performance-optimized versus comparable x86 platforms. IBM is designing the solution to be interoperable with the rest of the COBOL application family, as well as with CICS, IMS, DB2 and other z/OS runtimes. Lastly, IBM Consulting’s deep domain expertise in IBM Z application modernization makes it a prime resource for clients in key industries such as banking, insurance, healthcare and government.

Final Analysis

Though marketing professionals may feel comfortable with portraying modern and legacy technologies as a simplistic “new vs. old” conundrum, business owners, IT management and knowledgeable staff, including developers, understand the complexities of the modern/legacy dynamic. Rather than age, the larger issue is relevance: why an organization began employing a particular technology and how or whether that solution remains relevant to its owner’s needs.

It is not unlike how people and organizations remain relevant. Industries, companies, markets and larger economies are in a constant state of evolution. People and organizations succeed by adapting to those changes, by learning new skills, exploring new opportunities, and remaining vitally relevant to customers and partners. IBM’s new watsonx Code Assistant for Z demonstrates that what is true for people can also be true for information technologies.

Read next: Digital Transformation Guide

Reshoring Alleviates Supply Chain Issues – But It Needs Tech to Control Costs https://www.eweek.com/it-management/reshoring-alleviates-supply-chain-issues/ Thu, 10 Aug 2023 19:18:28 +0000 https://www.eweek.com/?p=222848

In the post-pandemic world of skill shortages, supply chain disruptions, and geopolitical issues, manufacturers are struggling to operate at full capacity. In a bid to tackle these issues, manufacturers and logistics providers have sought solutions nearer to home – they have “reshored” operations.

Reshoring’s primary goal is to regain control over the entire end-to-end supply chain—it’s about manufacturing products on local soil, and it’s a process that’s been gaining traction from companies worldwide.

From a North American perspective, the picture is no different. Many U.S. companies have begun the shift away from globalization as the default, with research suggesting that nearly 350,000 jobs were reshored to the U.S. in 2022, a notable increase when compared to the 2021 figure of 260,000.

The movement has also seen companies become less reliant on China. Now, many economies, including the U.S., India, and the European Union, are looking to establish a roadmap that will balance supply chains and increase resiliency. The China Plus One Strategy is an approach adopted by a number of businesses looking to include sourcing from other destinations. Already, numerous companies have turned to Vietnam and India as alternatives, with both countries reporting an uptick in investment from U.S. companies that have built plants there.

According to the Reshoring Initiative IH 2022 Data Report, supply chain gaps, the need for greater self-sufficiency, and a volatile geopolitical climate are major factors driving reshoring. The report found that 69% of companies cited supply chain disruptions as the primary reason for reshoring.

There is now movement on a national level to strengthen supply chains and promote domestic manufacturing with the introduction of the bipartisan National Development Strategy and Coordination Bill in December 2022. This bill highlights the importance of manufacturing reshoring to national economic development going forward into 2023.

Sustainability and Tech in Reshoring

Recent research commissioned by IFS, polling senior decision-makers working for large enterprises globally, found that 72% have increased their usage of domestic suppliers, compared to international suppliers.

From a sustainability perspective, there are huge benefits to be gained. In fact, reshoring gives manufacturers a golden opportunity to look hard at their manufacturing processes and how they can make those processes more sustainable.

For example, it can minimize CO2 emissions as transport is reduced and spur a reduction in wasteful overproduction as supply chains are brought closer together. As the whole world strives to act more sustainably in the race to net-zero, environmental benefits will play a huge role in driving new sourcing strategies.

However, the raw materials, components, and products that they source from suppliers are likely to become more expensive, especially as inflation continues to gather pace globally. As a result, 53% have considered increasing the proportion of materials/components they produce in-house. But again, these measures and others like them that organizations are now taking to mitigate risk are likely to add cost, complexity, and waste to the supply chain.

Therefore, reshoring is not the silver bullet to mitigating supply chain disruption entirely. Often, companies underestimate the sheer level of effort, costs, and logistical planning required to make reshoring a success.

But for many U.S. companies, the extra costs of manufacturing within the country are outweighed by the savings in customs and shipping costs, along with the sustainability benefits, when compared with offshore operations.

It’s here that organizations need the helping hand of technology. In fact, it can be a key facilitator for solving the supply chain, labor, and production challenges associated with reshoring.

In a recent McKinsey study, 94% of respondents said Industry 4.0 technologies helped keep operations running during the COVID-19 pandemic, and 56% said those technologies had been critical to responding efficiently.

A new IDC InfoBrief, sponsored by IFS and entitled Shaping the Future of Manufacturing, shows an active correlation between digital maturity and profit. According to the research, manufacturers reporting an optimized level of digital transformation saw profits increase 40%, while those with less advanced digital transformation maturity suffered bigger reductions in profit in the last fiscal year.

Tech has been quick to respond to the call to deliver the agility and fast “Time to Insight” (TTI) that manufacturers need to better forecast demand and provide a more detailed view of sustainability across product supply chains. Exceptional supply chain management will be a vital part of the move to reshoring. The IFS study showed supply chain management was now seen by 37% of respondents as one of the top three priorities their organization is trying to solve through technology investment.

Reshoring in Action: Will the Benefits Be Worth It?

In a recent Kearney index on manufacturing reshoring, 92% of executives expressed positive sentiments toward reshoring. And that’s no surprise when you consider the additional benefits on offer. As well as a more protected supply chain ecosystem, there are also positive societal benefits from the move to reshoring.

According to the U.S. Reshoring Initiative, in 2021 the private and federal push for domestic U.S. supply of essential goods propelled reshoring and foreign direct investment (FDI) job announcements to a record high.

From a broader perspective, there are many profit and supply chain benefits at stake for manufacturers. For example, research found that 83% of consumers in the U.S. are willing to pay 20% more for American-made products, and 57% say that the origin of a product would sway their purchasing decision.

From a management standpoint, control over operations has significantly increased. Bringing operations all to one centralized location gives businesses tighter control over processes. Manufacturers will also benefit from shorter supply chains as much of today’s manufacturing is spurred by IoT, AI, and machine learning capable of performing monotonous tasks around the clock.

On a day-to-day level, on-site teams will experience increased collaboration as reshoring drastically reduces the time difference between headquarters and the manufacturing plant.

Tech Needs to Drive Reshoring

It’s easy to see why the appeal of reshoring is prompting a move toward U.S.-based manufacturing initiatives. By addressing reshoring now with the right technology, efficiently and cost-effectively, manufacturers will put themselves in a great position to not only survive but also thrive long into the future.

Of course, as with any major transformation, there are hurdles to overcome. But the long-term results of reshoring, from increased employment to tighter manufacturing control, suggest it’s a journey worth embarking on. As more and more companies around the world look to reshore operations on home soil, manufacturers will need the guiding hand of a flexible and agile software platform to make reshoring a reality at scale.

About the Author:

Maggie Slowik is the Global Industry Director for Manufacturing at IFS.

Dell’s 2023 ESG Report: Evolving Corporate Culture https://www.eweek.com/it-management/dells-2023-esg-report-evolving-corporate-culture/ Wed, 19 Jul 2023 17:41:17 +0000 https://www.eweek.com/?p=222754

Environmental, Social and Governance (ESG) programs are anything but one-size-fits-all endeavors. Instead, most organizations work closely with stakeholders to ensure that programs align with their needs, carefully considering how factors affect business and internal and external relationships.

This varies significantly according to industry, region and commercial markets. Plus, it is commonplace for ESG programs to evolve as priorities and circumstances change.

Recently, Dell published its new ESG Report for FY2023, updating its achievements and overall strategy. Let’s consider how the company has progressed – but first let’s take a brief look at the state of enterprise ESG issues today.

Also see: Top Digital Transformation Companies

Today’s Corporate ESG Issues

It is worth noting the importance of ESG programs. The issues covered in these programs affect all of our lives and are closely tied to organizations’ relationships with stakeholders, including customers and strategic partners. Empowering disadvantaged groups of customers and businesses is just good for business.

That is especially true in the U.S. where despite their myriad benefits, ESG policies have become bugaboos of “wokeness” among some politicians and groups. Many of those individuals and alliances are also attempting to dial-back broader environmental and social justice advances but are encountering resistance from progressive organizations and individuals, as well as from seemingly unlikely organizations. Those include large corporations, pension funds, insurers and investment firms.

Why would those disparate players actively protect ESG programs? A couple of issues are top of mind. First, disadvantaging specific groups of consumers and businesses to appease politicians and special interest groups is simply bad for business.

Equally important are the negative impacts that anti-ESG efforts can have on promising businesses and industries. Consider that earlier this year, 19 Republican state governors signed an open letter warning of the “direct threat” posed by ESG proliferation. Some connect the ‘E’ in ESG to renewable energy technologies and programs, such as hydroelectric, wind power and ethanol subsidies for farmers. Since many or most of the governors who signed the letter lead states that benefit from renewable energy initiatives, their anti-ESG rhetoric seems ironic in the extreme.

Finally, and perhaps most importantly, is the value that ESG programs and strategies offer to companies doing business globally. Environmental, social and governance issues vary widely in importance and scope from place to place. The variety of ESG subject matter means that organizations can craft programs to maximize value for the customers and partners they believe are most in need.

Far from being the direct threat that some U.S. state governors and other politicians and groups imagine, ESG continues to deliver substantial, welcome benefits to businesses, state institutions and consumers worldwide.

Dell’s FY 2023 ESG Report

Dell Technologies has emphasized the importance of ESG-related issues since 1998 when the company published its initial Environmental Progress Report.

Beginning in 2002, the company shifted to annual reports charting its focus on and progress in key areas, including the environment, sustainability and corporate social responsibility. The company has maintained these commitments through recent political headwinds because it understands these priorities are not only good for business but also for the communities in which they operate.

What are some of the key highlights in Dell’s new FY2023 ESG report?

First, the company refined the goals included in the FY2022 report and condensed its 25 top-level goals to:

  • Achieve net zero greenhouse gas (GHG) emissions across Scopes 1, 2 and 3 by 2050.
  • Reuse or recycle one metric ton of materials for every metric ton of products Dell customers buy by 2030.
  • Make or utilize packaging made from recycled or renewable material for 100 percent of Dell products by 2030.
  • Leverage recycled, renewable or reduced carbon emissions materials in more than half of the products Dell produces by 2030.
  • Employ women as 50% of Dell’s global workforce and 40% of the company’s global people leaders by 2030.
  • Employ people who identify as Black/African American or Hispanic/Latino as 25% of Dell’s U.S. workforce and 15% of its U.S. people leaders by 2030.
  • Improve the lives of 1 billion people through digital inclusion by 2030 through efforts such as supply chain training and initiatives aimed at girls and women, or underrepresented groups.
  • Provide support for and participation in community giving or volunteerism by 75% of Dell team members by 2030.

Additionally, in 2022 Dell began framing a trust model centered on security, privacy and ethics. Given the importance of those areas in terms of establishing and maintaining trusted relationships, the company is emphasizing “Upholding Trust” with the goal of having customers and partners rate Dell Technologies as their most trusted technology partner.

Finally, the company demonstrated its continuing commitment to diverse supplier spend by doing over $3 billion in business with small and diverse companies. Plus, for the 13th consecutive year, Dell was recognized by the Billion Dollar Roundtable (BDR), which celebrates corporations that spend at least $1 billion annually with minority- and women-owned businesses.

Further details, background information and customer/partner examples can be found in the full Dell Technologies ESG Report for FY2023.

For more information, also see: What is Data Governance

Final Analysis

Transformation is a concept and process that permeates the technology industry, but it also has many guises. For example, there are the “digital transformation” strategies and solutions that so many vendors emphasize, which aim to help customers improve business outcomes by maximizing compute performance and data efficiency. Other efforts include process transformation, such as leveraging automation and logistical efficiencies to improve supply chain performance.

One topic less commonly discussed is corporate cultural transformation. This is when an organization continually and proactively evolves to adapt and benefit from changes in commercial markets, business practices and demand forecasts, as well as shifts in politics, economies and the environment. In my opinion, this type of transformation holds a central role in Dell Technologies’ ESG strategy and its annual ESG reports.

Many of the practical steps the company is taking—expanding the use of recycled and renewable materials, for example—simply make good business and financial sense. Others, including achieving net zero GHG emissions, reflect the company’s deep understanding of and intention to practically address climate change and other environmental issues.

Some goals enumerated in the new FY2023 report may appear aspirational but are far more practical than one might expect. At a Dell Technologies World session a few years ago, Michael Dell noted (I confess to paraphrasing here) that, “A company should look like its customers and partners.”

That is a particularly profound statement, not to mention being highly applicable to business and a wide range of public and private organizations and institutions. Without having such a vision and investing in efforts to achieve it, individuals, businesses and governments will inevitably find their vision blurring, their frontiers shrinking and their opportunities dwindling.

By embracing cultural evolution through supporting and advancing the careers of underrepresented groups, by actively improving communities and the lives of a billion people and by working to become the vendor that customers and partners trust the most, Dell Technologies will further grow its own outlook, relevance and potential for success.

Is there a greater or more important goal for any organization?

For more information, also see: Digital Transformation Guide

Navigating the Perfect Storm with Applied Intelligence https://www.eweek.com/it-management/navigating-the-perfect-storm-with-applied-intelligence/ Wed, 21 Jun 2023 21:21:26 +0000 https://www.eweek.com/?p=222614

With budgets now tightening across corporate America, and the era of easy money a fast-fading memory, the time is nigh for achieving a long-sought goal in the world of business intelligence and analytics: closing the loop.

As far back as 2001, at data warehousing firms like my old haunt of Daman Consulting, we touted the value of “operationalizing” business intelligence. The idea was to leverage BI-derived insights within operational systems dynamically, and thus directly improve performance.

Though embedded analytics have been around for decades, it’s fair to say that most BI solutions in this millennium have focused on the dashboard paradigm: delivering high-level visual insights to executives via data warehousing, to facilitate informed decision-making.

But humans are slow, much slower than an AI algorithm in the cloud. In the time it takes for a seasoned professional to make one decision, AI can ask thousands of questions, get just as many answers, and then winnow them down to an array of targeted, executed optimizations.

That’s the domain of applied intelligence, a closed-loop approach to traditional data analytics. The goal is to fuse several key capabilities – data ingest, management, enrichment, analysis and decisioning – into one marshaling area for designing and deploying algorithms.
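In code, the closed loop is less exotic than it sounds. Here is a toy Python sketch of the ingest-score-decide-act cycle; the data source, thresholds, and actions are all hypothetical:

```python
import random

def ingest() -> dict:
    # Stand-in for a streaming feed of transactions.
    return {"amount": random.uniform(1, 5_000), "risk_score": random.random()}

def decide(txn: dict) -> str:
    # Decisioning logic lives next to the model, not in a human's inbox.
    if txn["risk_score"] > 0.9:
        return "block"
    if txn["risk_score"] > 0.7 and txn["amount"] > 1_000:
        return "review"
    return "approve"

def act(txn: dict, decision: str) -> None:
    # The executed optimization: act on it and log the outcome, which can
    # later be fed back to retrain the model.
    print(f"{decision}: ${txn['amount']:.2f} (risk={txn['risk_score']:.2f})")

for _ in range(5):  # the loop closes with no human in the middle
    txn = ingest()
    act(txn, decide(txn))
```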

There are many benefits to this approach: transparency, efficiency, accountability; and most importantly in today’s market? Agility. During times of great disruption, organizations must have the ability to pivot quickly. And when those decisions are baked in via automation? All the better.

It also helps in the crucial domain of explainability, the capacity to articulate how an artificial intelligence model came to its conclusion. How explainable is a particular decision to grant a mortgage loan? How repeatable? What are the biases inherent in the models, in the data? Is the decision defensible?
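Open-source tooling exists to answer exactly those questions. As one common approach, here is a minimal sketch using the shap package to attribute an individual (hypothetical) loan decision to its input features:

```python
import shap
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical stand-in for a loan-approval training set.
X, y = make_classification(n_samples=500, n_features=6, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X, y)

# SHAP attributes the model's output for one applicant to each feature,
# making the decision inspectable and, ideally, defensible.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:1])
print(shap_values)  # per-feature contributions for the first applicant
```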

On a related topic: The AI Market: An Overview

Take It To the Bank

The rise of fintech startups and neobanks, coupled with rapidly changing interest rates, has put tremendous pressure on traditional financial market leaders to innovate rapidly but safely. Rather than embrace a rear-guard strategy, many firms are looking to AI to regain momentum.

As CPTO for FICO, Bill Waid has overseen a wide range of banking innovations. UBS reduced card fraud by 74%, while Mastercard optimized fraud detection in several key ways, including automated messaging to solve the omni-channel conundrum of communications.

The Mastercard story demonstrates how a large financial institution is now able to dynamically identify, monitor, and manage client interactions across a whole host of channels – and fast enough to prevent money loss. A nice side benefit? Less-annoyed customers.

In a recent radio interview, Waid explained another situation where collaboration improves marketing. “In banking, from a risk perspective, one of the most profitable products is credit card. So if you were to ask somebody from risk: which would you push, it would be the credit card.”

But other departments may disagree. “If you ask the marketing person, they have all the stats and the numbers about the uptake, and they might tell you no, it’s not the credit card, at least not for this group (of customers), because they’re actually looking for a HELOC or an auto loan.”

The point is that you can drive away business by making the wrong suggestion. Without collaborating around common capabilities from a centralized platform, Waid says, that mistake would have likely gone into production, hurting customer loyalty and revenue.

With an applied intelligence platform, he says, key stakeholders from across the business all have their fingers in the pie. This helps ensure continuity and engagement, while also providing a shared baseline for efficacy and accountability.

Think of it as a human operating system for enterprise intelligence, one that’s connected to corporate data, predictive models, and decision workflows, thus achieving cohesion for key operational systems. In the ideal scenario, it’s like a fully functioning cockpit for the enterprise.

This transparency leads to confidence, a cornerstone of quality decision outcomes: “That confidence comes in two dimensions,” he says. “The first is: can you understand what the machine is doing? Do you have confidence that you know why it came to that prediction?

“The second element is that in order for the analytic to be useful, it’s gotta get out of the lab. And many times, I see that the analytic comes after the operationalization of a process, where there is more data, or a flow of data that’s well warranted to an analytic.”

For more information, also see: Best Data Analytics Tools

Bottom Line: The Analytic Becomes an Augmentation

This is where the rubber meets the road for applied intelligence: the analytic becomes an augmentation. And when the business has that transparency, they get comfortable and adopt the insight into their own operational workflow. That’s when the machine’s intended value is felt.

“Platforms provide unification: bringing process, people, and tech together,” Waid says. And as AI evolves, with Large Language Models and quantum computing closing in, it’s fair to say that the practices of applied intelligence will provide critical stability, along with meaningful insights.

Also see: 100+ Top AI Companies 2023

Sageable CTO Andi Mann on Observability and IT Ops https://www.eweek.com/enterprise-apps/sageable-observability-it-ops/ Tue, 20 Jun 2023 23:01:25 +0000 https://www.eweek.com/?p=222609

I spoke with Andi Mann, Global CTO & Founder of Sageable, about key points revealed in an upcoming report on digital transformation. He also highlighted trends in observability, DevOps, IT Ops, and AIOps.

Among the topics we covered: 

  • Based on your latest research into Digital Transformation, what technologies are bubbling to the top?
  • What key trends are you seeing in Observability? Why is it getting so much attention?
  • AI is everywhere, and creeping into IT Ops too. How are ML and AI impacting IT Ops and DevOps today? What about the near future?
  • Looking ahead, what is the Next Big Thing for Ops?

Listen to the podcast:

Also available on Apple Podcasts

Watch the video:
