GDG stands for Generation Data Group. The life cycle of a GDG is explained in the following steps:

5 Top GDG Prime Points

  1. Create GDG
  2. GDG Template
  3. Create first GDG
  4. Create new generation GDG
  5. Delete GDG

1. Create a GDG File

Create a GDG file. The following JCL, which calls IDCAMS, will do it. In this example, LIMIT(15) is the number of generations you wish to create and keep. Of course, change the GDG file name to suit your requirements:

//STEP1 EXEC PGM=IDCAMS  
//SYSPRINT DD SYSOUT=*  
//SYSIN DD *   
 DEFINE GDG(NAME(STEWART.APPLY.LOG.GDGROUP1) -
        LIMIT(15) -
        NOEMPTY -
        SCRATCH)
/*

2. Create a GDG Model

Create a model or template for the individual generations of data sets. Here the default DCB attributes for the Apply or Capture log were used. Note that the space is rather small (5 tracks) in this example so you might want to increase it.

//STEP020 EXEC PGM=IEFBR14 
//GDGMODEL DD DSN=STEWART.APPLY.LOG.GDMODEL1, 
// DISP=(NEW,CATLG,DELETE), 
// UNIT=SYSDA, 
// SPACE=(TRK,5), 
// DCB=(LRECL=1024,RECFM=VB,BLKSIZE=6144,DSORG=PS)


3. Generate GDG


As a test, you can create the first generation data set by running this JCL example. Each time you run this step, a new generation data set is created in your group.

//STEP010 EXEC PGM=IEBGENER 
//SYSPRINT DD SYSOUT=* 
//SYSIN DD DUMMY 
//SYSUT1 DD *  
TEST DATA LINE 1
TEST DATA LINE 2
/* 
//SYSUT2 DD DSN=STEWART.APPLY.LOG.GDGROUP1(+1),
// DISP=(NEW,CATLG,DELETE),
// UNIT=SYSDA,
// SPACE=(TRK,5),
// DCB=STEWART.APPLY.LOG.GDMODEL1

After you have created several generations of log files, an ISPF 3.4 display of your GDG would look like this:

STEWART.APPLY.LOG.GDGROUP1
STEWART.APPLY.LOG.GDGROUP1.G0003V00
STEWART.APPLY.LOG.GDGROUP1.G0004V00
STEWART.APPLY.LOG.GDGROUP1.G0005V00
.
.
.
STEWART.APPLY.LOG.GDMODEL1

The ISPF 3.4 screen header also tells you how many generations of data sets you have.

For example, "Data Sets Matching STEWART.APPLY.LOG.GDGROUP1.G*   Row 1 of 15" indicates that there are 15 generations of the data set.
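As a side note, individual generations can also be referenced by relative number in a DD statement: (0) is the newest generation, (-1) the one before it, and (+1) allocates a new one within the same job. A minimal sketch (not from the original steps) that prints the newest log generation:

//READLOG EXEC PGM=IEBGENER
//SYSPRINT DD SYSOUT=*
//SYSIN DD DUMMY
//SYSUT1 DD DSN=STEWART.APPLY.LOG.GDGROUP1(0),
// DISP=SHR
//SYSUT2 DD SYSOUT=*

Relative numbers are resolved once per job, so every step in the same job sees the same (0).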

4. New Generation GDG

Because the Apply or Capture log file is not referenced by a DD statement, you must use IEBGENER to copy your app.log or cap.log file to the GDG. Place this step ahead of your Apply or Capture JCL.

This step creates a new generation of data in your group. Consider using the LOGREUSE=N parm when you start Capture or Apply so that each generation of the log is unique to the specific instance when Capture or Apply was run.

//COPYLOG EXEC PGM=IEBGENER 
//SYSPRINT DD SYSOUT=*
//SYSUT1 DD DSN=STEWART.DSN9.STEVE1.APP.LOG,
// DISP=SHR
//SYSUT2 DD DSN=STEWART.APPLY.LOG.GDGROUP1(+1),
// DISP=(NEW,CATLG,DELETE),
// UNIT=SYSDA,
// SPACE=(TRK,5),
// DCB=STEWART.APPLY.LOG.GDMODEL1
//SYSIN DD DUMMY
//SYSOUT DD SYSOUT=*
//SYSUDUMP DD SYSOUT=*

5. Delete GDG


If you need to delete your GDG, delete the individual data sets (G0003V00, G0004V00, etc.) and then run this IDCAMS job to delete the GDG base.

//STEP010 EXEC PGM=IDCAMS 
//SYSPRINT DD SYSOUT=*
//SYSIN DD *
 DELETE (STEWART.APPLY.LOG.GDGROUP1) GDG FORCE
/*

Finally, this example GDG was created with 15 generations of data sets using the LIMIT(15) parameter. If you wish to change the number of generations, run this IDCAMS ALTER example, where the number of generations is increased to 50. In the ALTER statement, use the GDG name that you created in Step 1.

//STEP010 EXEC PGM=IDCAMS 
//SYSPRINT DD SYSOUT=*
//SYSIN DD *
 ALTER STEWART.APPLY.LOG.GDGROUP1 LIMIT(50)
/*

The maximum value for LIMIT is 255.
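To confirm the change took effect, a LISTCAT of the GDG base displays its attributes, including the current limit. A sketch using the GDG name from Step 1:

//STEP020 EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN DD *
 LISTCAT ENTRIES(STEWART.APPLY.LOG.GDGROUP1) GDG ALL
/*

The SYSPRINT output lists the GDG base entry along with its associated generation data sets.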

Ref: IBM
