Big Data News

Source: Siliconangle

Big Tech escapes tariffs’ impact for now, but investors are wary — and should be

  • Investors are relieved that Trump's tariffs haven't impacted tech company earnings, with Alphabet, SAP, ServiceNow, and most chipmakers managing to avoid the effects.
  • Concerns arise as IBM's stock drops on its exposure to government contract cuts, while Intel struggles with layoffs and disappointing guidance under new CEO Lip-Bu Tan.
  • Artificial intelligence adoption by enterprises is progressing slowly due to challenges in data foundations, governance, and software development tools.
  • Startups are offering solutions to facilitate AI adoption as big tech faces regulatory challenges and struggles with metaverse initiatives.
  • Upcoming earnings reports from major tech companies such as Amazon, Apple, Microsoft, and Alphabet are anticipated, along with further developments in AI technologies and services.
  • The cybersecurity sector sees significant funding and new services, with Chainguard raising $356 million and Check Point introducing new AI-powered services.
  • Various AI-related initiatives and advancements are announced by companies like Datadog, Supabase, Adobe, and Microsoft, signaling ongoing innovation in the sector.
  • Tech earnings remain robust, with companies like Alphabet, IBM, and ServiceNow beating forecasts, while challenges like trade wars impact Intel's stock.
  • Policy developments include FTC suing Uber, concerns over DOGE impact, and criticism of Palantir's contract with ICE.
  • In cybersecurity, developments focus on enhancing threat detection, cloud security, and AI-powered solutions to address cyber risks facing enterprises.

Source: Amazon

Amazon SageMaker Lakehouse now supports attribute-based access control

  • Amazon SageMaker Lakehouse now supports attribute-based access control (ABAC) with AWS Lake Formation, using IAM principals and session tags to simplify data access management and maintenance.
  • ABAC lets organizations define dynamic access control policies based on business attributes associated with user identities.
  • SageMaker Lakehouse provides unified access to various data sources like Amazon S3, Redshift, DynamoDB, and supports querying using services like Redshift, Athena, EMR, and Glue.
  • ABAC offers flexibility in managing access rules, reducing administrative overhead by handling a smaller number of roles, and scalability for larger enterprises with numerous users and resources.
  • ABAC grants permissions based on user attributes and is context-driven, enabling administrators to restrict data access based on specific user attribute keys and values.
  • By using ABAC, organizations can reduce the number of roles required, grant access based on attributes like department or country, and easily create and maintain data access grants.
  • The implementation involves defining user attributes using IAM tags, setting access policies in Lake Formation, and granting permissions based on the predefined criteria, as sketched after this list.
  • The solution architecture includes defining attributes, setting up ABAC policies, and granting database and table permissions to users based on their attributes.
  • Users like data analysts, BI analysts, and data scientists can leverage ABAC in SageMaker Lakehouse for specific data access based on their roles and attributes.
  • The post demonstrates setting up ABAC for Example Retail Corp, granting specific data access permissions to different user personas, and utilizing analytics services like Athena, Redshift, and EMR.
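
As an illustration of the session-tag side of ABAC, here is a minimal sketch, assuming boto3 and hypothetical account IDs, role names, and attribute values; the matching Lake Formation grants that reference the tag are configured separately:

import boto3

sts = boto3.client("sts")

# Assume the analyst role with a session tag carrying the business
# attribute; access rules can then match on the tag rather than on a
# per-team IAM role. (ARN and tag values below are hypothetical.)
session = sts.assume_role(
    RoleArn="arn:aws:iam::111122223333:role/DataAnalystRole",
    RoleSessionName="analyst-session",
    Tags=[{"Key": "department", "Value": "finance"}],
)

# Query through Athena with the tagged session's temporary credentials,
# so access is evaluated with the attribute attached to the session.
creds = session["Credentials"]
athena = boto3.client(
    "athena",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)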

Source: Precisely

Is Your Data Understood and Compliant? Here’s How to Fix It

  • Without shared data definitions, ownership, and built-in compliance, data becomes a liability.
  • Business-friendly governance and stewardship frameworks enable teams to trust, manage, and use data confidently.
  • Start with small steps like clear roles, goals, glossaries, and workflows, and scale towards proactive compliance and increased data visibility.
  • Embedding governance into daily operations empowers teams, eliminates inefficiencies, enables innovation, and reduces risk.

Source: Siliconangle

Datadog snaps up Metaplane to improve data quality for AI applications and systems

  • Datadog Inc. has acquired startup Metaplane to expand its data observability offerings.
  • Metaplane's tools will help Datadog users identify and remedy data quality issues in critical business applications.
  • Metaplane's platform combines AI and column-level lineage to detect, resolve, and prevent data quality problems.
  • The acquisition aims to strengthen Datadog's observability capabilities and enable the building of reliable AI systems.

Source: TechBullion

The Strategic Imperative of Data Archiving: Beyond Storage

  • Organizations are recognizing data archiving as a strategic imperative, driven by compliance requirements, rising operating costs, and the growth of unstructured data.
  • Data archiving involves moving inactive data out of live systems for long-term preservation, accessibility, and compliance; a minimal lifecycle sketch follows this list.
  • Benefits of data archiving include compliance readiness, cost reduction, improved application performance, and risk reduction.
  • Archiving challenges include technological obsolescence, data integrity, security vulnerabilities, and retrievability.
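
As one concrete way to move inactive data out of a live system, here is a minimal sketch of an S3 lifecycle rule via boto3; the bucket, prefix, and retention periods are hypothetical examples, not recommendations:

import boto3

s3 = boto3.client("s3")

# Transition objects to archival storage 90 days after creation and
# expire them after roughly seven years of retention.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-records-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-inactive-records",
                "Status": "Enabled",
                "Filter": {"Prefix": "records/"},
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 2555},
            }
        ]
    },
)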

Source: Siliconangle

Relyance AI’s new data tracking tool paves the way for greater AI accountability and explainability

  • Relyance AI Inc. introduces Data Journeys, a platform focused on data tracking across an organization's systems to enhance AI accountability and explainability.
  • Data Journeys tracks the data flow from its origin to destination, as well as its use by various applications and services within the organization.
  • Relyance AI received $32 million in Series B funding last October and is known for its governance platform providing visibility into enterprise-wide data and controls for data protection.
  • By scanning all data sources, Data Journeys ensures compliance with company policies and regulations to maintain data integrity.
  • The platform offers a more comprehensive view than traditional data lineage tools by tracking data movements across different systems, applications, and AI models.
  • Enterprises face pressure from regulators, with AI regulation highlighted as a risk by over 25% of Fortune 500 firms, necessitating enhanced visibility into data processes.
  • Data Journeys is particularly beneficial for regulated industries like healthcare, with potential to revolutionize AI development and governance frameworks.
  • The platform addresses major impediments to enterprise AI adoption, including risk management, bias detection, explainability, and regulatory compliance.
  • Enhanced explainability provided by Data Journeys is crucial for AI decision-making in sensitive areas like loan approvals and medical diagnoses.
  • Relyance AI aims to build a unified AI-native platform for data governance and compliance, with Data Journeys playing a critical role in ensuring trust and accountability in AI systems.

Source: Amazon

Accelerate data pipeline creation with the new visual interface in Amazon OpenSearch Ingestion

  • Amazon OpenSearch Ingestion is a managed serverless pipeline for ingesting, filtering, transforming, and routing data.
  • The new visual interface simplifies pipeline creation and management from the AWS Management Console.
  • Key improvements include a guided visual workflow, automatic permission setup, and real-time validation checks.
  • The visual interface allows for automatic discovery of sources and sinks, eliminating manual resource configuration.
  • Automated IAM role management simplifies security setup for pipelines, reducing the risk of errors.
  • Real-time validation capabilities help identify and resolve issues during pipeline creation.
  • The visual interface streamlines source and processor configuration, presenting error messages for any issues.
  • Sink configuration offers flexibility for choosing OpenSearch destination and mapping options.
  • The interface updates the underlying YAML/JSON pipeline configuration in real time as users make selections; a programmatic sketch follows this list.
  • Automated IAM role management across pipeline components saves time and reduces permission-related errors.
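
For comparison with the console workflow, the same kind of pipeline can also be created programmatically; a minimal sketch, assuming the boto3 "osis" client and hypothetical pipeline, domain, and role names:

import boto3
from textwrap import dedent

osis = boto3.client("osis")

# Data Prepper-style pipeline definition; the OpenSearch endpoint and
# sink role below are hypothetical placeholders.
pipeline_body = dedent("""\
    version: "2"
    log-pipeline:
      source:
        http:
          path: "/logs"
      sink:
        - opensearch:
            hosts: ["https://search-example-domain.us-east-1.es.amazonaws.com"]
            index: "application-logs"
            aws:
              sts_role_arn: "arn:aws:iam::111122223333:role/OsisSinkRole"
              region: "us-east-1"
    """)

osis.create_pipeline(
    PipelineName="log-pipeline",
    MinUnits=1,
    MaxUnits=4,
    PipelineConfigurationBody=pipeline_body,
)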

Source: Siliconangle

Sumer Sports brings AI analytics to the NFL

  • Sumer Sports is using AI-based video analytics to study every player on every play in the NFL to evaluate decision-making and success.
  • The company combines AI with seasoned football professionals to ensure accurate data interpretation.
  • Sumer Sports is working with 20 NFL veterans to bridge the gap between data science and football expertise.
  • The AI platform focuses on roster construction, player evaluation, and improving decision-making for NFL teams.
  • Sumer Sports emphasizes the importance of combining subjective and objective information with frame-level analysis to evaluate player performance.
  • The company's AI platform is being utilized by several NFL and NCAA teams, but there is some resistance from traditionalists in the industry.
  • Scott Pioli acknowledges that Sumer Sports represents a challenge in the industry and emphasizes the need for a shift in mindset towards AI-based tools in football.
  • The potential for Sumer Sports extends beyond football into high school sports, training tools, and other sports, with a current focus on collegiate and professional football.
  • Sumer Sports has created a free draft guide for those following the NFL draft, offering additional insights into player rankings.
  • Zeus Kerravala, principal analyst at ZK Research, wrote this article for SiliconANGLE on how Sumer Sports is bringing AI analytics to the NFL.

Source: Siliconangle

SumerSports brings AI analytics to the NFL

  • Scouting in the NFL is an imperfect science due to various factors like competition level and visibility of players.
  • SumerSports utilizes AI-based video analytics to evaluate player decisions and performance at a detailed level.
  • The company works with NFL veterans and data scientists to combine expertise and data interpretation.
  • SumerSports focuses on optimized roster construction, player evaluation, and decision-making support using AI.
  • The platform offers insights to teams for identifying undervalued players, evaluating player skills, and making strategic decisions.
  • SumerSports differs by combining subjective and objective information and providing frame-level insights for each player on every play.
  • Despite some resistance, several NFL and NCAA teams are already using SumerSports' AI platform.
  • There is potential for AI adoption across all sports, with SumerSports considering expansion to high school football and training tools.
  • The company offers a free draft guide with player rankings and insights for football enthusiasts.
  • SumerSports is positioned to revolutionize player evaluation and decision-making processes in the world of football.

Source: Towards Data Science

MapReduce: How It Powers Scalable Data Processing

  • MapReduce is a programming model introduced by Google to enable large-scale data processing in a parallel and distributed manner across compute clusters.
  • Tasks in MapReduce are divided into map and reduce phases: the map phase processes individual data records, and the reduce phase aggregates values for each distinct key.
  • MapReduce computation is distributed across a cluster with a master handling task scheduling and workers executing map and reduce tasks.
  • The MapReduce model is suitable for parallelizing data transformations on distinct data partitions followed by aggregation.
  • MapReduce was initially used by Google to build indexes for its search engine and is applicable to various data processing tasks.
  • MapReduce jobs involve partitioning data, executing map tasks in parallel, sorting key-value pairs, and aggregating results in reduce tasks.
  • MapReduce has influenced modern frameworks like Apache Spark and Google Cloud Dataflow with its fundamental distributed programming concepts.
  • While MapReduce introduced key distributed programming concepts, modern frameworks like Spark have evolved to offer more flexibility and efficiency.
  • The MapReduce model, though not commonly used today, played a significant role in the design of current distributed programming frameworks.
  • MapReduce tasks can be expressed using libraries like mrjob, which simplify writing the mapper and reducer logic for a data transformation, as in the sketch after this list.
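
A minimal word-count job in mrjob, the canonical MapReduce example (file and input names are up to the caller):

from mrjob.job import MRJob

class MRWordCount(MRJob):
    # Map phase: emit a (word, 1) pair for every word in an input line.
    def mapper(self, _, line):
        for word in line.split():
            yield word.lower(), 1

    # Reduce phase: sum the counts emitted for each distinct word.
    def reducer(self, word, counts):
        yield word, sum(counts)

if __name__ == "__main__":
    MRWordCount.run()  # run locally: python word_count.py input.txt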

Source: Siliconangle

Supabase reels in $200M for its open-source relational database

  • Supabase, an open-source relational database developer, has raised $200 million in funding at a $2 billion valuation.
  • The round included participation from Accel, Coatue, Y Combinator, Craft Ventures, and Felicis, along with several angel investors.
  • Supabase's platform offers an alternative to Google's Firebase database, claiming to be faster and easier to set up.
  • The company has seen rapid growth, with a user base of over 2 million developers and more than 3.5 million database environments.

Source: Siliconangle

AI, outages and scale: Inside Dynatrace’s take on navigating digital chaos

  • Businesses must address growing digital complexities in a systematic way to remain competitive.
  • Managing digital complexities as a strategic advantage is crucial for businesses.
  • Dynatrace helps organizations find and fix issues in their digital ecosystems before they disrupt service.
  • AI capabilities like anomaly detection and root cause analysis have long supported organizations in optimizing operations.

Source: Siliconangle

Ocient raises $42.1M more for its speedy data analytics platform

  • Ocient, a startup with a speedy data analytics platform, has raised $42.1 million in funding.
  • The investment marks the second extension of the company's Series B round, with the total raised now at nearly $90 million.
  • Ocient's platform allows analysis of historical and real-time data in one place, removing the need for multiple analytics environments.
  • The platform offers high query performance, advanced compression methods, and managed environment options for customers.

Source: Amazon

Read and write Apache Iceberg tables using AWS Lake Formation hybrid access mode

  • Enterprises are adopting the Apache Iceberg table format for features like change data capture (CDC), ACID compliance, and schema evolution.
  • AWS Lake Formation allows managing fine-grained data access permissions centrally and scaling data access within and outside organizations.
  • Lake Formation hybrid access mode enables using IAM policy-based permissions for write workloads and Lake Formation permissions for read access to Iceberg tables in Amazon S3.
  • Use cases for Lake Formation hybrid access mode include avoiding data replication, minimal interruption to existing IAM policy-based user access, and supporting transactional table writes.
  • Key steps in setting up permissions involve registering data locations, granting permissions to roles, and verifying permissions for Data-Analyst and IAMAllowedPrincipals.
  • Critical setup steps include creating IAM roles and Iceberg tables, opting in to hybrid access mode, and testing table access as Data-Analyst in Athena; a boto3 sketch follows this list.
  • Using Amazon EMR Studio for upsert operations on Iceberg tables, verifying data updates, and cleaning up resources post-use are essential parts of the process.
  • Hybrid access mode in Lake Formation allows gradual adoption of Lake Formation permissions alongside IAM-based permissions for different use cases, ensuring flexible access control.
  • The approach demonstrated can be extended to other open table formats and Data Catalog tables, providing organizations with control over schema and data updates.
  • Authors include Aarthi Srinivasan, a Senior Big Data Architect, and Parul Saxena, a Senior Big Data Specialist Solutions Architect, both with expertise in AWS Lake Formation and big data solutions.
  • The methodology outlined encourages experimentation and adoption of Lake Formation permissions while maintaining control over data operations through IAM policies.
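
A minimal sketch of the registration and grant steps in boto3, with hypothetical bucket, account, database, and role names (HybridAccessEnabled is the flag that opts a registered location into hybrid access mode):

import boto3

lf = boto3.client("lakeformation")

# Register the Iceberg table's S3 location in hybrid access mode, so
# existing IAM policy-based access keeps working for principals that
# have not opted in to Lake Formation permissions.
lf.register_resource(
    ResourceArn="arn:aws:s3:::example-datalake-bucket/iceberg/",
    RoleArn="arn:aws:iam::111122223333:role/LFRegistrationRole",
    HybridAccessEnabled=True,
)

analyst = {"DataLakePrincipalIdentifier": "arn:aws:iam::111122223333:role/Data-Analyst"}
table = {"Table": {"DatabaseName": "icebergdb", "Name": "customer_orders"}}

# Grant read-only Lake Formation permissions on the Iceberg table;
# write workloads continue under the existing IAM policies.
lf.grant_permissions(
    Principal=analyst,
    Resource=table,
    Permissions=["SELECT", "DESCRIBE"],
)

# Opt the analyst role in, so Lake Formation permissions (not IAM
# alone) govern its reads of this table.
lf.create_lake_formation_opt_in(Principal=analyst, Resource=table)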

Source: Precisely

What Will the CDO of the Future Look Like?

  • The article discusses the future evolution of the Chief Data Officer (CDO), transitioning to a hybrid role combining strategy, innovation, and human engagement.
  • The CDO role is compared to the historical evolution of the HR Director, indicating a shift from enforcing rules to implementing strategies aligned with company objectives.
  • There is a focus on humanizing the CDO role to promote its evolution and importance in successful digital transformation.
  • Communication skills, the ability to speak the language of business, and promoting data culture are highlighted as key for the CDO's success.
  • The future CDO is envisioned as a hybrid model merging technological expertise with humanistic sensitivity and the ability to integrate into executive teams.
  • Skills necessary for the future CDO include expertise in technology, communication, forming alliances with HR Directors and CIOs, and strategic thinking.
  • The article asks whether the CDO will become a permanent strategic pillar within organizations and suggests a potential renaming, such as Chief Data & AI Officer, to reflect a broader scope.
  • The ideal future CDO is described as an empathetic tech expert who can connect technology, human values, and collective engagement for the benefit of all employees.
  • The discussion concludes that by 2050, CDOs will play a crucial role in digital transformation, shaping a future where technology and humanity work together for corporate strategy.
  • Advice for future CDOs includes staying true to data value extraction, developing communication skills, and embracing a hybrid role focused on serving both technology and human needs.
