Here are the top-performing and most-viewed blog posts of 2021. The topics are SQL, Python, VSAM, and AWS.
How to Use Multiple Like Conditions in SQL
Here is an example of how to use multiple LIKE conditions in the WHERE clause of SQL.
The SQL WHERE clause fetches records quickly when you write the conditions correctly, ideally on indexed table columns. And many a time, you need to filter records using LIKE conditions.
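As a quick illustration of the idea, here is a minimal sketch using Python's built-in `sqlite3` module; the table name, columns, and sample rows are my own assumptions, not from the original post.

```python
import sqlite3

# Hypothetical in-memory table for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?)",
    [("Alice", "SALES"), ("Bob", "IT"), ("Amit", "HR"), ("Carol", "SALES")],
)

# Multiple LIKE conditions combined with OR in one WHERE clause:
# names starting with 'A', or departments containing 'IT'.
rows = conn.execute(
    "SELECT name FROM employees WHERE name LIKE 'A%' OR dept LIKE '%IT%'"
).fetchall()
matches = [r[0] for r in rows]
print(matches)
```

Combining the LIKE predicates with AND instead of OR would narrow the result to rows satisfying both patterns.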
SQL Query Examples on Multiple WHERE Conditions
This post covers how to combine multiple conditions in the SQL WHERE clause, including IN, less-than, greater-than, =, AND, OR, and CASE. If the WHERE condition is not written properly, the query takes more CPU time, since it has to fetch more rows.
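To make those operators concrete, here is a small sketch (again via `sqlite3`; the orders table and its values are assumptions for illustration) that combines IN, a comparison, AND, and OR in one WHERE clause, with a CASE expression in the SELECT list.

```python
import sqlite3

# Hypothetical orders table for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "EAST", 500.0), (2, "WEST", 120.0), (3, "EAST", 80.0), (4, "NORTH", 900.0)],
)

# IN, >, AND, and OR combined in the WHERE clause;
# CASE classifies each matching order as HIGH or LOW.
query = """
    SELECT id,
           CASE WHEN amount > 400 THEN 'HIGH' ELSE 'LOW' END AS tier
    FROM orders
    WHERE region IN ('EAST', 'WEST')
      AND (amount > 100 OR region = 'EAST')
"""
result = conn.execute(query).fetchall()
print(result)
```

Note the parentheses around the OR branch: without them, AND binds tighter than OR and the filter would match different rows.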
15 Top AWS IAM Service Interview Questions
The Identity and Access Management (IAM) service provides access for users (entities). Below are useful interview questions to help you understand the IAM service quickly.
COBOL VSAM Files READ with START Logic
Here is a sample COBOL program that shows how to use the READ, WRITE, and READ NEXT statements. The READ statement retrieves a record randomly by key, while READ NEXT reads the VSAM file sequentially until end of file.
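For readers unfamiliar with keyed VSAM access, the START / READ NEXT pattern (position at a key, then read sequentially to end of file) can be sketched as a Python analogy; this is not COBOL, and the record keys and values below are assumptions for illustration.

```python
import bisect

# Assumed keyed "file": record keys map to record data.
records = {100: "CUST-A", 200: "CUST-B", 300: "CUST-C", 400: "CUST-D"}
keys = sorted(records)

def start(search_key):
    """Analogy for COBOL START KEY >=: find the position of the first
    record whose key is greater than or equal to the search key."""
    return bisect.bisect_left(keys, search_key)

def read_next_from(index):
    """Analogy for a READ NEXT loop: yield records sequentially from
    the current position until end of file."""
    for key in keys[index:]:
        yield key, records[key]

# Position at key >= 250, then read sequentially to EOF.
tail = list(read_next_from(start(250)))
print(tail)
```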
12 Tricky Python Coding Interview Questions
Among the leading computer programming languages, Python takes first place. In this post, I am sharing 12 frequently asked Python coding interview questions on the basics of the language.
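The post's actual questions are not reproduced here, but a classic example of a tricky basics question looks like this sketch (the function names are my own): why does the function below "remember" values across calls?

```python
# Tricky: the default list is created ONCE, at function definition time,
# so every call that omits the argument shares the same list object.
def append_item(item, bucket=[]):
    bucket.append(item)
    return bucket

first = append_item(1)
second = append_item(2)
# first and second are the SAME list: [1, 2], because the default is shared.

# The idiomatic fix: use None as the sentinel and build a fresh list per call.
def append_item_fixed(item, bucket=None):
    if bucket is None:
        bucket = []
    bucket.append(item)
    return bucket
```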
Related posts
- Ingesting Data from AWS S3 into Databricks with Auto Loader: Building a Medallion Architecture
  In this blog post, we will explore how to seamlessly ingest data from Amazon S3 into Databricks using Auto Loader. We will also discuss performing transformations on the data and implementing a Medallion architecture for better management and processing of large datasets. What is the Medallion Architecture? The Medallion architecture is a data modeling pattern…
- Exploring Databricks Unity Catalog – System Tables and Information_Schema: Use Cases
  Databricks Unity Catalog offers a unified governance solution for managing structured data across the Databricks Lakehouse platform. It enables organizations to implement fine-grained access controls, auditing, and monitoring, enhancing data governance and compliance. Key functionalities include centralized metadata management, data discovery, dynamic reporting, and data lineage tracking, optimizing performance and collaboration.