Databases

Source: Dev

Custom Data Types

  • PL/SQL is a strongly typed language: every variable used in a program or subprogram must be declared with a data type. PL/SQL ships with a number of predefined data types.
  • Scalars are atomic data types such as NUMBER, BOOLEAN, and VARCHAR, while composite data types consist of one or more scalars. Examples of composite data types are record types, collection types, and object types.
  • Subtypes do not introduce a new type; instead, they place optional constraints on a base type. They generally improve the readability and reliability of the code.
  • Composite data types take three forms: record types, object types, and collection types. An internal component of a composite type can be either a scalar or another composite type, and components are accessed using dot notation.
  • Object types have three components: a name, attributes, and methods. Method declarations can be preceded by MEMBER or STATIC, with comparison methods declared via MAP or ORDER. Like packages in PL/SQL, an object type requires a body if it has methods.
  • There are three kinds of collections: associative arrays, variable-size arrays (VARRAYs), and nested tables.
  • Subtypes, records, and collections are usually declared without the CREATE keyword; they are declared and used inside packages and subprograms (see the declaration sketch after this list).
  • Custom data types can be instrumental in making your code more efficient and readable.
  • Oracle's Live SQL platform can be used to practice these concepts.
  • Custom data types in PL/SQL fall into scalar and composite categories. Scalar types can be further constrained using subtypes, while composite data types are user-defined types built from one or more scalars. Created by the user, they allow abstractions of real-world objects, similar to other object-oriented programming languages. The three collection forms in PL/SQL are nested tables, variable-size arrays (VARRAYs), and associative arrays.
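
To make these declarations concrete, here is a minimal PL/SQL sketch (the type and variable names are invented for the example and do not come from the article):

    DECLARE
      -- Subtype: constrains a base type without creating a new one
      SUBTYPE percent_t IS NUMBER(3);

      -- Record type: a composite of scalar components
      TYPE employee_rec IS RECORD (
        id    NUMBER,
        name  VARCHAR2(100),
        bonus percent_t
      );

      -- The three collection forms
      TYPE name_list_t  IS TABLE OF VARCHAR2(100);                 -- nested table
      TYPE top3_t       IS VARRAY(3) OF NUMBER;                    -- variable-size array
      TYPE salary_map_t IS TABLE OF NUMBER INDEX BY VARCHAR2(100); -- associative array

      emp      employee_rec;
      salaries salary_map_t;
    BEGIN
      emp.id   := 1;            -- record components are accessed with dot notation
      emp.name := 'Ada';
      salaries(emp.name) := 90000;
    END;
    /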

Source: Amazon

2024: A year of innovation and growth for Amazon DynamoDB

  • Amazon DynamoDB saw advancements in security, performance, cost-effectiveness, and integration capabilities in 2024.
  • Key updates include significant price reductions for on-demand throughput and global tables, the introduction of warm throughput, and a public preview of multi-Region strong consistency for global tables.
  • Enhanced features like AWS PrivateLink support, resource-based policies, and attribute-based access control have made DynamoDB more secure, resilient, and flexible.
  • With updates like zero-ETL integrations with Amazon Redshift and Amazon SageMaker Lakehouse, customers can build faster and more reliable applications.
  • DynamoDB Local now supports configurable maximum throughput for on-demand tables and indexes, making it easier to validate API actions before releasing code.
  • The team also published documentation that covers the high-level and low-level libraries in depth and addresses the most common configuration settings for the Python SDK.
  • Michael Shao is a Senior Developer Advocate on the Amazon DynamoDB team.
  • Looking ahead to 2025, DynamoDB encourages customers to try the new features and share their experiences at @DynamoDB or on AWS re:Post.
  • To learn more about these upgrades, check out the Amazon DynamoDB Developer Guide and the DynamoDB getting started guide.
  • The improvements signal that DynamoDB is committed to providing a reliable, resilient, and cost-effective storage solution for businesses of all sizes.

Source: Amazon

Gather organization-wide Amazon RDS orphan snapshot insights using AWS Step Functions and Amazon QuickSight

  • Organizations can now search across multiple linked accounts and AWS Regions to identify manual snapshots that have no associated RDS instance or cluster, using a solution built on AWS Step Functions and AWS Lambda.
  • The solution uses AWS Step Functions in conjunction with AWS Lambda functions to generate orphan snapshot metadata for all accounts in user-defined Regions, depending on the deployment architecture.
  • The metadata generated from the orphan snapshots is stored in Amazon Simple Storage Service (Amazon S3) and transformed into Amazon Athena tables by AWS Glue, which Amazon QuickSight then uses to generate orphan snapshot insights.
  • The solution consists of three deployments using AWS CloudFormation stacks:
  • Monitoring account deployment (Steps 1–2 detailed in the post)
  • Management account deployment (Steps 3–4, optional)
  • Post-solution deployment (Steps 5–7)
  • The cost of the solution varies with the number of accounts, the number of Regions, and the Step Functions workflow run frequency, at roughly $20 per month for eight accounts and three Regions.
  • Tested in the us-east-1 Region, the solution supports use cases from individual accounts to organization-wide deployments, from identifying the oldest RDS snapshot of an instance to finding the instances with the largest active snapshots (a hypothetical Athena query is sketched below).
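
For flavor, here is a hypothetical Athena query over such a metadata table (the table and column names are invented for illustration; the actual schema is produced by the AWS Glue step described in the post):

    -- Hypothetical: the ten oldest orphan snapshots across the organization
    SELECT account_id,
           region,
           snapshot_id,
           snapshot_create_time,
           allocated_storage_gb
    FROM   orphan_snapshot_metadata
    ORDER  BY snapshot_create_time ASC
    LIMIT  10;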

Source: Hackernoon

Optimizing Oracle Database Queries Using Execution Plans: A Step-by-Step Guide

  • The article discusses the importance and benefits of generating execution plans in Oracle databases to diagnose and improve SQL query performance. It provides a comprehensive guide to generating, interpreting, and optimizing execution plans, with practical examples, actionable insights, and advanced tuning strategies for large-scale datasets.
  • The guide covers three methods of generating execution plans: EXPLAIN PLAN, Oracle SQL Developer (GUI), and capturing actual execution statistics (a minimal EXPLAIN PLAN example follows this list).
  • Interpreting execution plans starts with their structure and hierarchy: execution plans are tree structures in which child operations feed data to parent operations. The article details key metrics, cost models, and important red flags to look out for.
  • The guide lists common performance bottlenecks such as full table scans, high join costs, sorting overheads, and cardinality mismatches, and provides detailed solutions for each.
  • Query tuning strategies, including index optimization, SQL rewrites, and optimizer hints, are discussed in detail.
  • The article concludes with a practical optimization example and best practices: index strategically (align indexes with query predicates), regularly update statistics for accurate cardinality estimates, and test changes incrementally with A/B plan comparisons.
  • Advanced tips are also shared, including adaptive query optimization and the SQL Tuning Advisor, Oracle's built-in tool for automated recommendations.
  • The article is a valuable guide for developers who want to improve the performance of their Oracle databases through execution plans.
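
As a minimal illustration of the EXPLAIN PLAN method (the table and predicate are placeholders; DBMS_XPLAN.DISPLAY is the standard way to render the most recently explained plan):

    -- Generate an execution plan without running the query
    EXPLAIN PLAN FOR
      SELECT *
      FROM   employees
      WHERE  department_id = 50;

    -- Display the plan from the plan table
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);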

Source: Dbi-Services

PostgreSQL: Indexes and casting

  • Casting one data type to another in PostgreSQL queries can affect index access.
  • Adding a cast to a query can disable index access and result in a sequential scan.
  • To keep index access when a cast is required, create an index on the casted expression (illustrated below).
  • The post is a reminder to use casting mindfully in PostgreSQL queries.
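
A minimal sketch of the pattern (the table, column, and types are chosen for illustration, not taken from the post; note the extra parentheses required around the expression in the index definition):

    CREATE TABLE t (ts timestamp);
    CREATE INDEX i_ts ON t (ts);

    -- The cast in the predicate prevents use of i_ts, forcing a sequential scan:
    SELECT * FROM t WHERE ts::date = DATE '2025-01-01';

    -- An index on the casted expression restores index access:
    CREATE INDEX i_ts_date ON t ((ts::date));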

Source: Dev

ODyS on OCI - Oracle Dynamic Scaling (oh this is what I needed)

  • ODyS is the Oracle Dynamic Scaling utility; it is mainly used for Exadata and works with ExaCS via a plugin.
  • It enables cost savings on ExaCS by scaling OCPU cores up or down based on compute usage.
  • ODyS continuously monitors CPU usage and can scale cores according to a custom configuration.
  • ODyS and ODCC are cost-saving utilities when implemented correctly on OCI ExaCS.

Source: Hackernoon

The HackerNoon Newsletter: Surviving the Google SERP Data Crisis (2/2/2025)

  • The HackerNoon Newsletter delivers top-quality stories straight to your inbox.
  • Highlighted articles include 'Why Scenario Planning Is An Effective Strategy Tool', '15 Databases, 15 Use Cases—Stop Using the Wrong Database for the Right Problem', 'A Basic Knowledge of Python Can Help You Build Your Own Machine Learning Model', and 'OCR Fine-Tuning: From Raw Data to Custom Paddle OCR Model'.
  • Other articles cover topics like solving the WebSocket scaling problem, tracking other traders' moves in cryptocurrency trading, analyzing the Ethereum gas limit debate, and surviving the Google SERP data crisis.
  • The HackerNoon Team invites readers to answer the greatest interview questions of all time and enjoy free reading material.

Source: Cedricleruth

Navigating ERP Cloud Migrations: Lessons from Complex Enterprise Deployments

  • Cloud migration initiatives are essential for scalability, cost optimization, and performance, but they come with unique challenges such as security risks, compliance risks, and operational disruption.
  • The primary factors to consider during a cloud migration are data integrity and security, interoperability and compatibility, performance optimization, and cost management.
  • Following the 6Rs of cloud migration, organizations can rehost, refactor, rearchitect, rebuild, replace, or retire their IT applications based on their complexity and regulatory constraints.
  • By implementing hybrid and multi-cloud approaches, organizations can keep their on-premises workloads secure while running scalable applications in the cloud, avoiding vendor lock-in.
  • Incorporating event-driven architecture using Apache Kafka or RabbitMQ, AWS Lambda, Kubernetes, or Azure Functions helps with real-time data processing while reducing business operation delays.
  • Automation is key to accelerating migration while minimizing the risk of manual errors. Organizations can implement CI/CD pipelines with Jenkins, GitHub Actions, or GitLab and enforce infrastructure as code (IaC) to maintain consistency across multi-cloud environments.
  • Post-migration observability of performance degradation, security anomalies, and cost inefficiencies can be achieved with Dynatrace, Prometheus, Grafana, or other real-time monitoring platforms.
  • Real-world case studies show organizations achieving cost savings and better reporting efficiency, including a global corporation adopting Oracle Cloud Financials and another transitioning from a legacy HRMS to Oracle HCM Cloud.
  • By implementing the outlined strategies and best practices, enterprises can achieve seamless transitions, reduce risks, optimize costs, and fully realize the benefits of cloud-native infrastructure.

Source: Medium

7 Free Resources to Master SQL and Boost Your Data Career

  • There are plenty of free resources available to learn SQL and boost your data career.
  • SQL is the standard language for interacting with relational databases and is in high demand in the job market.
  • Here are seven free resources to help you master SQL:
  • 1. SQL: A Beginner's Guide - a comprehensive book covering everything from basic syntax to advanced concepts.
Source: Dev

Analyzing billing information using BigQuery

  • Google BigQuery is a powerful tool for analyzing large datasets efficiently.
  • This tutorial explains how to import billing data into BigQuery and analyze the dataset.
  • You can run SQL queries to retrieve and analyze billing information (a sample query is sketched below).
  • BigQuery provides the resources for making informed decisions based on large datasets.
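
A sketch of the kind of query involved (the project, dataset, and table names are placeholders following the Cloud Billing export naming pattern; the real table name includes your billing account ID):

    -- Hypothetical: total cost per service for the current month
    SELECT service.description AS service_name,
           ROUND(SUM(cost), 2) AS total_cost
    FROM   `my_project.billing_dataset.gcp_billing_export_v1_XXXXXX`
    WHERE  usage_start_time >= TIMESTAMP_TRUNC(CURRENT_TIMESTAMP(), MONTH)
    GROUP  BY service_name
    ORDER  BY total_cost DESC;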

Source: Medium

ChatGPT and scheduling: increase your productivity

  • ChatGPT can be used for bug identification, code understanding, and migration.
  • Regular Expressions (Regex) are powerful for searching, validation, and manipulation of strings in programming.
  • ChatGPT can help in learning SQL, constructing queries, and optimizing performance.
  • ChatGPT can generate automated test code for unit and integration testing.

Source: VoltDB

Introducing the Volt Active Data Developer Edition 

  • Volt has introduced the Volt Active Data Developer Edition.
  • The Volt Active Data Developer Edition allows developers and architects to try out the Volt product suite without manually setting up a cluster.
  • It is targeted towards senior developers and architects evaluating next-gen technology for addressing latency, ACID, scale, total cost of ownership, and high availability issues.
  • The developer edition provides a representative deployment of Volt, including a three-node Volt AD cluster, a single node of Volt SP for handling incoming data streams, and a client node for running test applications.

Source: TechBullion

How and why backup and recovery are necessary in Oracle: international database expert Olga Badiukova on critically important knowledge.

  • In 2024, 92% of IT leaders reported an increase in the frequency of database hacking attempts compared to 2023.
  • Human error, cyber threats, technical failures, and natural disasters are some of the reasons for conducting regular database backups.
  • Oracle Recovery Manager (RMAN), Oracle Data Guard, Oracle Flashback, and Oracle Cloud Infrastructure Backup are some of the tools for creating and managing backups offered by Oracle Database (a small Flashback sketch follows this list).
  • Database administrators must create a Disaster Recovery Plan to guarantee quick restoration of applications and data after an accident.
  • It is advisable to regularly test backups for integrity, store data copies on different media, and build a high-availability DB system.
  • Database-Level Recovery, Point-in-Time Recovery for Test Systems, and Minimizing Downtime are some of the methods of data recovery.
  • Despite Oracle Database's powerful tools for backup and data recovery, human errors and backup malfunctions can still cause system failures.
  • Data backup and recovery is a strategic task that requires a professional approach and regular updates, testing, and optimization.
  • Establishing an efficient backup and recovery system ensures business resilience to unexpected events that may lead to catastrophic data loss.
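
As a small illustration of the Oracle Flashback feature mentioned above (a minimal sketch; the table name is a placeholder, and the table must have row movement enabled before a flashback to a timestamp):

    -- Allow the table's rows to move, a prerequisite for FLASHBACK TABLE
    ALTER TABLE orders ENABLE ROW MOVEMENT;

    -- Rewind the table to its state ten minutes ago
    FLASHBACK TABLE orders TO TIMESTAMP (SYSTIMESTAMP - INTERVAL '10' MINUTE);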

Source: Medium

How to Enhance SQL Code Security and Maintainability

  • SQL procedures are a valuable tool for enhancing SQL code security and maintainability.
  • SQL procedures, also known as SQL stored procedures, are database objects that can be executed with a single call (a minimal sketch follows this list).
  • They improve database security, modularity, and code reusability.
  • SQL procedures should not be confused with SQL UDFs: the two have some similarities but serve different purposes.
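
A minimal sketch of a stored procedure, here in PostgreSQL syntax (one dialect among several; the procedure, table, and column names are invented for the example):

    -- A procedure encapsulating an update, executed with a single call
    CREATE PROCEDURE give_raise(emp_id int, pct numeric)
    LANGUAGE SQL
    AS $$
      UPDATE employees
      SET    salary = salary * (1 + pct / 100)
      WHERE  id = emp_id;
    $$;

    CALL give_raise(42, 5);  -- give employee 42 a 5% raise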

Source: Dbi-Services

PostgreSQL 18: Per-relation cumulative statistics for [auto]vacuum and [auto]analyze

  • PostgreSQL 18 introduces per-relation cumulative statistics for [auto]vacuum and [auto]analyze.
  • Previously, pg_stat_all_tables provided statistics about vacuum and analyze operations, but not the total time spent on them.
  • In PostgreSQL 18, the pg_stat_all_tables view includes columns for total vacuum and analyze time (queried in the sketch below).
  • These statistics help identify the relations where [auto]vacuum and [auto]analyze have consumed the most time.
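
A quick way to use the new columns (a sketch; the column names here follow the pattern described for PostgreSQL 18, e.g. total_vacuum_time, and should be checked against the release documentation):

    -- Relations where vacuum and autovacuum have spent the most cumulative time
    SELECT relname,
           total_vacuum_time,        -- manual VACUUM time
           total_autovacuum_time,    -- autovacuum time
           total_analyze_time,
           total_autoanalyze_time
    FROM   pg_stat_all_tables
    ORDER  BY total_vacuum_time + total_autovacuum_time DESC
    LIMIT  10;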
