techminis

A naukri.com initiative

Databases

Image Credit: Dev

JSON: A Technical Overview

  • JSON, or JavaScript Object Notation, is widely used for data interchange due to its simplicity and readability.
  • In the realm of blockchain technology, JSON plays a vital role in structuring and storing data, ensuring integrity and simplifying retrieval.
  • Data compression and archiving utilities, like IZArc, utilize JSON for configuration management and metadata storage, providing a user-friendly experience.
  • Note-taking and information management applications, such as Evernote, use JSON for efficient data storage and synchronization across multiple devices.
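
The interchange and round-trip properties described above are easy to see with Python's standard `json` module; the configuration document below is illustrative, not taken from any particular tool:

```python
import json

# A configuration-style document, similar in spirit to what archiving
# or note-taking tools persist as JSON (all names here are made up).
config = {
    "app": "example-notes",
    "sync": {"enabled": True, "interval_seconds": 300},
    "tags": ["db", "json"],
}

# Serialize to a JSON string, then parse it back: the round trip is
# lossless for the basic JSON types (objects, arrays, strings, numbers,
# booleans, null).
text = json.dumps(config, indent=2)
restored = json.loads(text)
assert restored == config
```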


Image Credit: Analyticsindiamag

Oracle is Weaving AI into Every Layer of Tech

  • Oracle is embedding AI across every layer of its technology stack.
  • The company has built a supercluster with 130,000 GPUs for AI training.
  • Oracle's AI-ready platform allows users to interact with databases in natural language and extract insights instantly.
  • Oracle is incorporating AI agents into its SaaS applications across ERP, HCM, SCM, and CX.


Dbi-Services

So much praise and honor after mastering complex data security challenges

  • The dbi services team receives high praise and honor for successfully addressing complex data security challenges in a project for Galenica.
  • The team initially had doubts and faced various obstacles, but their dedication, curiosity, creative problem-solving, and technical expertise helped them exceed project goals.
  • Other key factors for project success include flexibility, collaboration, listening, and maintaining a positive attitude.
  • Overall, the team's achievements demonstrate the importance of these qualities in achieving excellence in project execution.


Image Credit: Amazon

Timestamp writes for write hedging in Amazon DynamoDB

  • The article explains how to enforce client-side, timestamp-based write ordering in Amazon DynamoDB so that a write carrying a lower timestamp can never overwrite one with a higher timestamp, even when requests arrive out of order or are retried.
  • Each write includes a ConditionExpression that compares the new item's timestamp against the existing item's; this guarantees that only newer writes are applied.
  • The post relates this to hedging, a technique normally used to cut read latency, and shows how the same idea extends to write requests: hedged duplicates of a write are resolved consistently based on the original timestamp.
  • A timestamp attribute on every item determines whether a write is still valid, which matters in scenarios such as data stream processing or SDK retries.
  • Deletions are handled with tombstones that mark items as soft-deleted, and the article walks through an example of processing them correctly.
  • A Python code sample demonstrates PutItem with a conditional expression that enforces timestamp-based write order.
  • Tombstones need to be managed, and a TTL attribute can orchestrate their eventual removal from the table.
  • The technique works for PutItem and DeleteItem but not for BatchWriteItem, which lacks condition support; UpdateItem can be used with caution, depending on which part of the item is updated.
  • Cost analysis suggests the design adds minimal expense; accurate timestamp values and mitigation of clock skew are essential for reliable write-sequence enforcement.
  • In short, the article is a practical guide to using timestamp attributes and condition expressions in DynamoDB to maintain write-sequence integrity and handle deletions efficiently.
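
The condition logic is simple enough to sketch without AWS access. In DynamoDB the guard would be a PutItem with a ConditionExpression such as `attribute_not_exists(pk) OR ts < :new_ts`; the pure-Python simulation below (attribute names `pk` and `ts`, and all values, are illustrative, not from the article's sample) shows why a stale, late-arriving write is rejected:

```python
# Toy stand-in for a DynamoDB table: a dict keyed by partition key.
table = {}

def conditional_put(pk, ts, payload):
    """Apply the write only if the item is absent or strictly older,
    mimicking ConditionExpression "attribute_not_exists(pk) OR ts < :new_ts"."""
    existing = table.get(pk)
    if existing is None or existing["ts"] < ts:
        table[pk] = {"ts": ts, "payload": payload}
        return True
    return False  # condition failed: a newer write already landed

# Out-of-order delivery: the later write (ts=2) arrives first,
# then the retry of an older write (ts=1) is correctly rejected.
conditional_put("item-1", 2, "new")
conditional_put("item-1", 1, "stale")
assert table["item-1"]["payload"] == "new"
```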


Dbi-Services

Cleanup Oracle After Patching

  • To clean up old Oracle Homes after applying quarterly Release Updates, there are two options for Out-Of-Place patching and one for In-Place patching.
  • Option 1: Set the environment for the Oracle Home you want to delete and run the deinstall command.
  • Option 2: If deinstall does not work, find the Home name of the Oracle Home to remove, set the environment for it, and DETACH it.
  • For In-Place patching: check the size of the .patch_storage subdirectory to see the disk usage, then list and delete obsolete patches using opatch.


Image Credit: Dev

Choice of Table Column Types and Order When Migrating to PostgreSQL

  • When migrating to PostgreSQL, selecting appropriate column types and optimizing their order is crucial for maximizing performance and storage efficiency.
  • For numeric types, consider the range and precision requirements and choose the appropriate integer or decimal type.
  • Character types such as VARCHAR, TEXT, and CHAR can be chosen based on the desired length and padding requirements.
  • Implementing advanced techniques like partial indexes, BRIN indexes, and UNLOGGED tables can further optimize PostgreSQL performance.
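
One concrete aspect of column ordering is alignment padding: PostgreSQL aligns fixed-width columns to their type's alignment boundary, so placing wide columns first avoids padding bytes before them. The sketch below is a toy illustration of that idea; the alignment table covers only a few common types and should be treated as illustrative, not a substitute for consulting `pg_type`:

```python
# Approximate alignment sizes for a few common PostgreSQL types.
ALIGN = {
    "bigint": 8,
    "timestamptz": 8,
    "int": 4,
    "smallint": 2,
    "boolean": 1,
}

def order_columns(cols):
    """Sort fixed-width columns widest-first so narrow columns pack at
    the tail instead of forcing padding before wide ones."""
    return sorted(cols, key=lambda c: -ALIGN[c[1]])

cols = [
    ("active", "boolean"),
    ("id", "bigint"),
    ("qty", "smallint"),
    ("created", "timestamptz"),
]
print([name for name, _ in order_columns(cols)])
# wide 8-byte columns come first, then smallint, then boolean
```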


Image Credit: Dev

Flexible interval aggregation: From SQL to SPL

  • This article discusses flexible interval aggregation and compares implementations in SQL and SPL, esProc's Structured Process Language.
  • The SQL implementation requires creating a temporary interval table, then joining against it and grouping the data.
  • The SPL code simplifies the process by grouping and aggregating the data directly on the intervals.
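
The SQL side of the comparison can be sketched with SQLite: build an interval table, then join and group against it. Table names, columns, and interval boundaries below are made up for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (amount REAL)")
con.executemany("INSERT INTO orders VALUES (?)",
                [(v,) for v in (5, 15, 25, 35, 45)])

# The temporary interval table the SQL approach requires.
con.execute("CREATE TABLE intervals (lo REAL, hi REAL)")
con.executemany("INSERT INTO intervals VALUES (?, ?)",
                [(0, 20), (20, 40), (40, 60)])

# Associate each order with its interval, then group per interval.
rows = con.execute(
    """SELECT i.lo, i.hi, COUNT(o.amount)
       FROM intervals i
       LEFT JOIN orders o
         ON o.amount >= i.lo AND o.amount < i.hi
       GROUP BY i.lo, i.hi
       ORDER BY i.lo"""
).fetchall()
print(rows)  # one row per interval with its order count
```

An SPL-style implementation would skip the interval table and bucket the values directly while aggregating, which is the simplification the article highlights.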


Image Credit: Amazon

Simplify database authentication management with the Amazon Aurora PostgreSQL pg_ad_mapping extension

  • Authentication serves as the foundational pillar of security in any enterprise environment, playing a pivotal role in safeguarding sensitive data and resources from unauthorized access.
  • This post explores the use of Kerberos authentication for Amazon Aurora PostgreSQL-Compatible Edition using AWS Directory Service for Microsoft Active Directory and particularly the new pg_ad_mapping extension.
  • Kerberos authentication offers centralized authentication and single sign-on (SSO) benefits, along with the use of short-lived tickets for enhanced security.
  • Aurora PostgreSQL authentication offers password authentication, AWS Identity and Access Management database authentication, and Kerberos authentication. Each method operates independently.
  • Kerberos authentication on Amazon RDS and Aurora can be used in conjunction with AWS Managed Microsoft AD.
  • Prior to versions 14.10 and 15.5, Amazon Aurora PostgreSQL supported only Kerberos-based authentication with AD for individual users.
  • In addition to AD user authentication, AWS provides an enhanced access control mechanism by integrating with AD security groups using the pg_ad_mapping extension.
  • The Aurora PostgreSQL pg_ad_mapping extension streamlines access management and mapping of AD security groups to database roles.
  • The solution harnesses the pg_ad_mapping extension to grant database access to groups of enterprise users managed in an AWS Managed Microsoft AD server.
  • Security best practices for Aurora PostgreSQL are discussed.


Image Credit: Dev

Exploring Subquery Alternatives: Understanding and Using CTE

  • Subqueries are queries nested within another SQL query, often used to perform intermediate calculations or data filtering.
  • Common Table Expressions (CTEs) are temporary named result sets defined within a SQL statement using the WITH keyword.
  • Deeply nested subqueries can be hard to read, challenging to maintain, and sometimes frustratingly slow; the author found debugging them a particular headache.
  • CTEs simplify complex queries by breaking them into smaller, reusable components, improving readability and maintainability.
  • In this article, the author shared a case study to highlight the differences between subqueries and CTEs.
  • The case study involves managing financial data related to sellers, specifically their transactions and withdrawals.
  • The article shows a subquery approach and a CTE approach to solve the problem of calculating the running balance for the selected seller_id.
  • The subquery approach calculates the running balance using a correlated subquery, which is executed repeatedly for each row in the outer query leading to significant performance overhead.
  • The CTE-based approach efficiently calculates the cumulative balance_after using a window function that processes the dataset in a more optimized manner compared to a correlated subquery.
  • A recommendation is made that the CTE-based approach should be used in scenarios where performance is critical, especially with larger datasets.
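
The CTE-plus-window-function pattern from the case study can be sketched in SQLite (window functions require SQLite 3.25+). The schema, seller_id values, and amounts below are illustrative, not the article's actual data:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE ledger (seller_id INT, ts INT, amount REAL)")
con.executemany("INSERT INTO ledger VALUES (?, ?, ?)",
                [(1, 1, 100.0), (1, 2, -30.0), (1, 3, 50.0), (2, 1, 10.0)])

# CTE narrows to one seller; the window function computes the running
# balance in a single pass, instead of a correlated subquery that
# re-scans the table for every output row.
rows = con.execute(
    """WITH movements AS (
           SELECT ts, amount FROM ledger WHERE seller_id = 1
       )
       SELECT ts, SUM(amount) OVER (ORDER BY ts) AS balance_after
       FROM movements
       ORDER BY ts"""
).fetchall()
print(rows)  # [(1, 100.0), (2, 70.0), (3, 120.0)]
```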


Medium

Database Isolation Levels: Understanding Transaction Consistency and Concurrency

  • Read Uncommitted, the lowest isolation level, permits dirty reads and is generally not used.
  • Read Committed isolation level allows non-repeatable read and phantom read.
  • Repeatable Read isolation level prevents non-repeatable read but still allows phantom read.
  • Multiple versions of the same record may exist in the MVCC approach.
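
The multi-version idea behind MVCC can be illustrated with a toy version store: each key keeps every committed version tagged with the transaction id that wrote it, and a snapshot reader sees only versions at or below its snapshot id. The structure is a deliberately simplified sketch, not how any real engine lays out its version chains:

```python
# key -> list of (txid, value) committed versions
versions = {}

def write(key, txid, value):
    """Commit a new version of key; older versions remain readable."""
    versions.setdefault(key, []).append((txid, value))

def read(key, snapshot_txid):
    """Return the newest version visible to a snapshot started at
    snapshot_txid, giving repeatable reads for that snapshot."""
    visible = [(t, v) for t, v in versions.get(key, []) if t <= snapshot_txid]
    return max(visible)[1] if visible else None

write("x", 1, "a")
write("x", 3, "b")            # a later transaction overwrites x
assert read("x", 2) == "a"    # the older snapshot still sees "a"
assert read("x", 3) == "b"    # a newer snapshot sees the update
```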


Image Credit: Siliconangle

Trump appoints VP JD Vance to oversee TikTok negotiations as US ownership talks intensify

  • U.S. President Donald Trump has appointed Vice President JD Vance to oversee the negotiations on a deal to save the operations of TikTok in the U.S.
  • TikTok, owned by ByteDance Ltd., has been in a legal limbo since being banned in the U.S. on Jan. 19.
  • Potential buyers for TikTok's U.S. operations include Oracle Corp., Microsoft Corp., and Amazon.com Inc.
  • TikTok and its backers are exploring the option of a joint venture with U.S. investors to address data security concerns.


Hrexecutive

From overwhelm to clarity: Democratizing people analytics for HR leaders

  • Paradox, an AI-powered recruiting platform, has acquired people analytics provider Eqtble to offer deeper insights and automation for talent management.
  • 93% of companies are increasing or maintaining investment in people analytics, but confusion and limited expertise remain challenges in the market.
  • HR professionals are in need of more accessible and straightforward access to insights that drive meaningful change.
  • Workday is laying off employees, while other HR tech companies make acquisitions and introduce AI agents to improve HR processes and employee experience.


Image Credit: Siliconangle

Investors cool on cloud but CEOs double down

  • Earnings reports from the big three cloud players disappointed investors this week. As is often the case, when a new wave hits, it tends to be overhyped at the beginning of the cycle and underestimated at the eventual steady state. The infrastructure-as-a-service and platform-as-a-service revenue alone for the Big Three approached $200 billion in 2024 and grew 25%. All three cited capacity constraints and, along with Meta Platforms Inc., are committing more than $300 billion in capital spending this year, most of it to support current and future AI demand.
  • Despite investor expectations, the hyperscalers Microsoft, Alphabet and Amazon all experienced triple-digit AI growth and cited capacity constraints. The capex required to pursue long-term AI benefits created an environment in which the hyperscalers were doubling down investments at a time when more traditional investors wanted faster returns. The current environment is marked by short-term skepticism toward cloud investments, yet executive teams are doubling down on AI-driven strategies with the expectation of massive future returns.
  • The Big Three cloud players and Alibaba accounted for $210 billion in revenue in 2025, roughly 24% growth on a $200 billion market, with Azure and GCP growing in the mid-30s (as you may recall, in 2023 we adjusted our Azure figures to strip out some non-cloud revenue). In this context, investors will likely begin weighing the long-term opportunity rather than short-term growth challenges and uncertainties.
  • The hyperscaler cloud market is likely to expand beyond $425 billion by the end of this decade. AWS currently holds about 50% of the market but is expected to slip below that mark this year; it operates at a $115 billion annual run rate yet remains capacity-constrained by chips and certain other components. In January 2023, Microsoft's Azure AI was ahead of Google, with 175N to Google's 127 (Google then standing at 72% of AWS).
  • The data indicates that Google is rapidly closing the AI gap with AWS when measured in account penetration and seems to underscore Google's strong technology platform. Despite the current skepticism toward cloud spending, the imperative to capture unprecedented AI opportunities drives continued high levels of investment. Firms that successfully manage near-term cost concerns while positioning for AI dominance stand to reap substantial rewards, validating the cloud’s indispensable role in the AI revolution.
  • Overall, the potential payoffs and efficiency gains offered by AI are compelling, encouraging CEOs to double down on investment despite short-term financial pressures. As cloud services mature, the next wave of AI capabilities can emerge and may become the leading differentiator in the market, giving firms a competitive edge.
  • The data illustrates that, unlike typical market scenarios where a single leader dominates revenue, three major hyperscalers can simultaneously generate strong returns, supported by the market's vast size. Multiple major vendors, including AWS, Microsoft Azure, and Google, can thrive thanks to the sector's scale, with additional players such as Oracle, IBM, Snowflake, and Databricks carving out niches.
  • Google cites its strong tech stack as giving it a competitive edge. The cloud ecosystem accommodates specialized firms such as Snowflake and Databricks, which build on top of the cloud, which further reinforces the ecosystem's flexibility and breadth.
  • Firms that invest prudently in both cloud and AI today are positioned to realize outsized gains in the future. The cloud market's size means three companies can dominate and make a lot of money, and, as the emerging consensus suggests, DeepSeek-like cost reductions in AI tooling are likely to accelerate adoption building on the cloud foundation.
  • The need to capture unprecedented AI opportunities drives continued heavy investment in the cloud market, and firms that successfully position themselves for AI leadership stand to reap substantial rewards. Despite current skepticism toward cloud spend and soft short-term earnings sentiment, the upside, particularly in driving operational efficiencies, is too significant for CEOs to overlook, underscoring the cloud's deep-seated position in the AI revolution.


Image Credit: Dbi-Services

Customer case study – SQL Server table partitioning

  • A client raised the following issues with their SQL Server: the database hosts multiple tables, including one of approximately 1 TB with several billion rows; the client wanted to archive certain data; and they wanted to reduce the duration of the maintenance jobs for indexes, statistics, and data-integrity checks.
  • They decided to partition the large table, as partitioning a large dataset can enhance query performance and simplify maintenance. In their case, they decided to create a new table that is an exact copy of the source table but is partitioned.
  • In this partitioned table, they created corresponding filegroups and used a partition function based on a datetime column to determine the partition ID according to the date value.
  • Creating a partition scheme maps the partition ID to the filegroups, and finally, they copied data from the source table to the new partitioned table and built the indexes.
  • Once the bulk of the data was copied, they were able to retrieve the delta data and switch to the new partitioned table without affecting applications that use it. In this way, they were able to process queries more efficiently and maintain the table more easily.
  • They used Ola Hallengren's database maintenance solution, which let them verify data integrity filtered by filegroup; for index maintenance, they used a cursor to process the table partition by partition, dynamically generating the SQL statements.
  • Finally, they created jobs for integrity checks (filtered by filegroup) and for partition-level index maintenance, so only the relevant partitions need to be checked or maintained.
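
The partition-by-partition index maintenance described above boils down to generating one `ALTER INDEX ... REBUILD PARTITION = n` statement per partition, rather than rebuilding the whole index at once. A minimal sketch of that dynamic SQL generation, with made-up index and table names (the article's cursor-based T-SQL would produce equivalent statements):

```python
def rebuild_statements(index, table, partition_ids):
    """Generate one T-SQL rebuild statement per partition, mirroring
    the cursor-driven dynamic SQL approach from the case study."""
    return [
        f"ALTER INDEX {index} ON {table} REBUILD PARTITION = {p};"
        for p in partition_ids
    ]

# Rebuild only the partitions that actually need maintenance.
stmts = rebuild_statements("IX_BigTable_Date", "dbo.BigTable", [3, 4])
for s in stmts:
    print(s)
```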

