Tuesday, June 17, 2014

Get Involved. Network. Share. Who knows where it will take you

My favorite Oracle conference starts soon. This year, KScope14 is about much more than going to all the great sessions presented by a phenomenal group of experts.  This time it's about involvement.  What could I do over the course of the year to contribute to an organization that has been instrumental in building my Oracle foundation?


Here is my story.

Year 1: - KScope12
Thanks to meeting Barbara Morris at an APEX meetup, I won a drawing at OpenWorld for a FREE registration to KScope12 in San Antonio, TX.  It was hot, but it was a blast. It was here I met great people like +Kent Graziano aka Oracle Data Warrior, that Data Vault guy; +Martin Giffy D'Souza, APEX guru extraordinaire; +Monty Latiolais, ODTUG President; the infamous Chet aka +ORACLENERD, a BI guy; and +Jeff Smith aka thatjeffsmith, who actually works for Oracle and, among other things, makes SQL Developer kick ass and keeps us entertained on Twitter.

Of all the sessions I attended that year, there was one session that resonated with me:
APEX: A CEO's Perspective by Cameron Mahbubian, ClariFit

I walked away from that session with a new purpose: building my reputation. My career goals demanded that I do that.  I dusted off that old Twitter account +dbcapoeira and started a blog about my experiences in the Oracle space.  You can find it here: blog.dbaontap.com  My small group of "oracle people" started following me and re-tweeting my blog posts.  I started following people I didn't even know, such as +Kellyn Pot'Vin aka dbakevlar and +Leighton Nelson.  Why?  Because everyone else was, and their blog posts got me thinking more about what could be done with Oracle technology.

Year 2: - KScope13
Continuing to remain active in social media, networking events, and webinars, my circle of friends kept growing.  I met this firecracker of an Essbase/Hyperion/EPM expert, +Sarah, at an Atlanta meetup. She moved, but we still see each other at conferences and communicate via Twitter.  Plus she can throw one heck of a party.  I was back at KScope13 in New Orleans! While I am thankful that work paid for this conference, I would have come out of pocket.  It's that good!  New and renewed friendships. +Galo Balda-Andrade reminded me of the power of regular expressions, and +Leighton Nelson brought the hotel network to a crawl downloading the just-released Database 12c.  He didn't just do a live demo with that, did he?  All signs point to yes. I also met +Bobby Curtis aka dbasolved, an +oracle-ace living in my hometown of Atlanta.  I also ran for the +ODTUG Board of Directors.  I didn't win, but I thank Jeff for the nomination, and Chet posted a great article of support right here. It's those things that make you want to do more. So I enrolled in school to get a degree in Database Administration and Linux Administration.

Year 3: - KScope14
This year is a bit different. Every year I strive to do more and be more.  Kscope14 sees me participating on one of the Abstract Review Committees headed by the energetic +Amy Caldwell.  She works at Chick-fil-A ... in Atlanta.  Congrats on winning the #DineWMonty contest!  I was anticipating an opportunity to reconnect with my +Oracle beginnings at the youth Hackathon. I joined a committee or two, seeing how things work from a different angle.  I got brave and submitted an abstract to KScope14 and yep, I'm presenting this year on database migration using +Oracle SQL Developer. I was so confident after this that I submitted something for OOW14, and it too was accepted.  One of the most rewarding things, because it is recognition by your peers, was being nominated for and awarded the new Oracle ACE Associate designation.

I am now at the beginning: a new path of service to the community that embraced and taught me so much. I can't wait to get to Seattle for KScope14 and do my part.  The countdown has started, as my +Delta Mobile App is telling me I have one day until check-in.  For anyone doing the math, that means I'll be there on Friday.  Let's connect when you get there.

-DB

Sunday, June 8, 2014

Using Shell Scripts as Concurrent Programs in EBS

Recently, I started looking at ways to address a scheduling problem when moving files associated with our business partners. We currently have a three-step process:

  1. Schedule a Concurrent Program to produce a file
  2. Set up a cronjob to schedule a shell script to move and archive that file on a SAMBA server for our Business Units to access (we are a Windows shop)
  3. Using Windows Scheduler, launch a C# program that will move the file from the SAMBA server to an FTP site where the business partner can "pick up" the file.


To receive files from our Business Partners, we simply reverse the process.  As you can imagine, this can be a scheduling nightmare.  If any of these activities gets out of sync, the entire process fails, and technical resources have to move the files manually.


In an effort to address this situation, we are going to rewrite the shell scripts to perform all the file movement activities and register them as part of a concurrent request set. This eliminates the scheduling issue by only running the shell script once the file has been generated.  An added benefit is the completion time for the process.  Now that the two steps are tied together (via the request set), there is no gap or dead time between the tasks.

Let's get to it:

First, I'm going to create a few shell scripts. The first one defines the values needed to FTP a file:

FILE #1
#!/bin/ksh 
# File name: ftp_setup.sh
# Description: Define values for remote ftp server
################################ REMOTE FTP ###########################
export blog_host='ftp01.blog.com'
export blog_user='blog12345'
export blog_pwd='DE45rtY7'
export blog_ftp_target='upload'
export blog_file='filename'
export blog_email='addy1@emailaddy.com'
########################################################################


FILE #2
#!/bin/sh
# File name: ftp_interface.sh (extension changed to .prog for the EBS concurrent program)
# Description: Conditionally FTP the file, based on the concurrent program parameter
################################ EXECUTION SCRIPT######################
. /home/dbryant/Documents/ftp_setup.sh
echo "loading values"
if [ "$5" = "Yes" ]; then  # parameter passed in from the Concurrent Program
  /home/dbryant/Documents/do_ftp.sh $blog_host $blog_user $blog_pwd $blog_ftp_target $blog_file $blog_email
elif [ "$5" = "No" ]; then
  echo "No FTP!"
else
  echo "Invalid option: please enter Yes or No"
fi
echo "End"


FILE #3
#!/bin/ksh
#---------------------------------------------------
# Script Name: DO_FTP.prog
# Script Purpose: FTP file to business
# Script Dependencies: file exists
#---------------------------------------------------

#FTP script
ftp -n $1 <<END_SCRIPT
quote USER $2
quote PASS $3
cd $4
put $5
quit
END_SCRIPT

echo "The file transfer was successful" | mailx -s "FTP File Transfer Notification" $6

exit 0

Once I have my scripts written and tested, I need to put them in the $XBOL_TOP/bin directory.  Strictly speaking, only the file you are going to execute as part of the request set needs to be in this directory, but keeping them together makes life easier.

1. You will need to change the file extension of the script you are registering to .prog.  In this case, FILE #2: ftp_interface.sh becomes ftp_interface.prog. 

2. Next, change the permissions on the file to 755.

3. Create a symbolic link to the script.

4. You can now register your program as an executable in EBS with HOST as the Execution Method.
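In shell terms, steps 1 through 3 look roughly like this. This is a sketch only: in a real environment $XBOL_TOP and $FND_TOP come from the APPS environment file, and my understanding is the symbolic link conventionally points at fndcpesr, the standard EBS wrapper for HOST programs; check your own environment's convention. The /tmp paths below are throwaway stand-ins so the sketch can be dry-run anywhere.

```shell
#!/bin/sh
# Sketch of steps 1-3. XBOL_TOP/FND_TOP would come from the APPS
# environment; the /tmp paths here are placeholders for a dry run.
XBOL_TOP=/tmp/xbol_demo
FND_TOP=/tmp/fnd_demo
mkdir -p "$XBOL_TOP/bin" "$FND_TOP/bin"
touch "$FND_TOP/bin/fndcpesr"              # stand-in for the real wrapper
echo '#!/bin/sh' > /tmp/ftp_interface.sh   # stand-in for FILE #2

# 1. Copy the script into the custom top's bin with the .prog extension.
cp /tmp/ftp_interface.sh "$XBOL_TOP/bin/ftp_interface.prog"
# 2. Make it executable.
chmod 755 "$XBOL_TOP/bin/ftp_interface.prog"
# 3. Symbolic link, named after the script minus the extension,
#    pointing at fndcpesr.
ln -sf "$FND_TOP/bin/fndcpesr" "$XBOL_TOP/bin/ftp_interface"
```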

You can now register this as a Concurrent Program and/or include it in a Request Set. There are plenty of blog posts and documentation for that, so have at it.

Something to note: if you decide to pass a parameter to your shell script, you will need to start with $5.  The first four are reserved for the USERID, REQUEST_ID, RESP_ID, and RESP_APPL_ID.
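A quick way to see the positions (a made-up demo, not EBS code): the function below stands in for a .prog script, with the first four positional parameters playing the reserved values and $5 the first user-defined parameter.

```shell
#!/bin/sh
# Demo of host-program parameter positions. The reserved values here
# are invented stand-ins; EBS supplies the real ones at run time.
show_params() {
  echo "reserved: $1 / $2 / $3 / $4"
  echo "first user parameter: $5"
}
# Simulate what the concurrent manager might pass:
show_params "APPS/pw" "1001" "SYSADMIN" "123456" "Yes"
```

This is why ftp_interface.prog tests `$5`, not `$1`, for the Yes/No flag.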



Have Fun!!



Wednesday, May 14, 2014

XML/BI Publisher in Oracle EBS - Bye Bye Discoverer


All good things must come to an end.  It has been a long run with plenty of ups and downs.  Getting Discoverer rolled out was one of my first tasks as I embarked upon my +Oracle E-Business Suite journey.  Alas, we are saying our final goodbyes to Discoverer.  This however opens new doors to address our reporting solutions.  Over the past couple of years, we have installed Hyperion and OBIEE.  These will address some of our needs, but there will still be the requirement for those day to day reports currently run from Discoverer.

This is why we are implementing BI Publisher as the new tool. Why did we decide to go this route?  Anyone familiar with EBS, in this case 11i, knows that the Standard Reports for the most part look like they were designed to be printed on a dot matrix printer and delivered by interoffice courier from the depths of the IT department.  This is why Discoverer was so popular: we could create reports, and Finance and HR could export them to Excel and "manipulate" them.  Now, back to why BI Publisher.

Requirements

  • Ability to export to Excel - Just like Discoverer
  • Recognize EBS organizational security - HR data in particular
  • Authenticate against EBS native login  - No LDAP or any semblance of SSO
  • Familiar UI - Minimize change management
We looked at several tools before making the decision. While all had their benefits, BI Publisher met the requirements best. An additional benefit of BI Publisher is that we can now schedule and deliver these reports on a predetermined basis.  The biggest thing we give up with this approach is that our functional users lose the ability to create their own reports and, depending upon certain implementation decisions, will be limited to the reporting instance refresh cycle.  More on that later.  If there is a need for actual 'real time' data, we can run the report(s) using +Oracle SQL Developer and provide them upon request.

Now that we have selected BI Publisher, we need to do a few things.  First we need to have our Functional Users identify the Discoverer and Standard Oracle reports that are critical to their LOB.  Fortunately, about 90% of our Discoverer reports are custom SQL, so we don't have to reverse engineer many of those delivered reports. Next we identify the Standard Reports that have XML templates available which the users might not have been aware of.  Having all this in hand we can now move on to the next part of the process.  

Setting up the Environment

Reporting against the transactional database can cause performance issues, so we will create a reporting database instance and create a DBLINK between it and the production database.  This covers the database tier only, and it will be refreshed on an agreed-upon interval.  Active Data Guard appears to be a good solution here; I encourage any contributions to this discussion.  From the application perspective, a separate Concurrent Manager will be created to handle the BI Publisher report requests.  This prevents long-running reports from queuing up in the existing Concurrent Managers and possibly causing performance issues with other processes.  The Output Post Processor (OPP) will apply the XML template and produce the final output.  Finally, you will want to install Oracle BI Publisher Desktop on your computer; it is required to create your template(s).  The application can be downloaded here.

Creating the XML Template

Once the reporting instance and concurrent manager are created, we can begin to migrate the custom SQL from Discoverer to BI Publisher.  The easiest way to do this is to launch Discoverer Administrator and copy the SQL from the Business Area.  You will obviously need this later.  Things to be aware of:
  • If your Discoverer reports are calling functions from within Discoverer, you will need to make note of them, as the SQL will need to be edited to include the function calls.
  • Conditions will need to be converted to parameters or added to the WHERE clause
  • In my opinion, sorting is optional
In a nutshell, any Discoverer features used need to be documented and applied appropriately, whether by editing the SQL or when you configure the concurrent program.  Lastly, since the concurrent program will be run from the production database, the underlying SQL will need to be modified with the DBLINK.  You'll be looking at code similar to this: SELECT * FROM hr.per_all_people_f@reportingDB;  Now that we've completed editing the SQL, the goal is to register it in EBS as a concurrent program, run the concurrent request, and select XML as the output type.  This produces the XML needed to build the BI Publisher template.

Create Your Report

Install the Oracle BI Publisher Desktop you downloaded earlier.  This adds a toolbar item to your MS Office applications, which is how we create the report templates.  In the screenshot below, we see this in Word:
At this point we have everything necessary to create the report in BI Publisher.  Here is a tutorial, Creating Reports in Oracle E-Business Suite Using XML Publisher, to get you started.  


Tuesday, April 29, 2014

Oracle EBS: Life Events, Life_Event_Occurred_Date (lf_evt_ocrd_dt) and last_update_date

Whenever we talk about Date Tracking in EBS, we always refer to the effective_start_date and effective_end_date fields.  Does code like this look familiar?

SELECT *
FROM per_all_people_f papf
WHERE sysdate between papf.effective_start_date and papf.effective_end_date

While this is great for making sure you have the record on the date you are looking for, I find it to be only part of the solution.  We also might look at things like the lf_evt_ocrd_dt.  This tells me when the Life Event in EBS occurred.

There are times, however, when this date will cause some problems.  This typically occurs when something is backdated.  You may be looking for an action to occur on a particular date.  If your functional users are late to the game, chances are a few days could pass and that date is gone. How do you address that, you ask?  Look at the last_update_date field.  This field tells you when the record was updated, no matter what the lf_evt_ocrd_dt is.

Take a look at the SQL statement below.  It comes from an Oracle EBS Alert that notifies another team that a new employee has been hired and needs a network logon account.  My initial implementation of this Alert assumed that the Life Events were being entered on the actual date, and that running daily would email everybody about the new hire. Take note of the line and trunc(sysdate) = trunc(bplfp.lf_evt_ocrd_dt) below. WRONG!!!  This of course resulted in a number of new hires getting missed by the alert.  By replacing that line with and trunc(sysdate) = trunc(bplfp.last_update_date) I was able to capture the date the row was updated even if the NEW HIRE life event was backdated.  Here is the output of this query with the two dates side by side.



Notice the varying dates of the lf_evt_ocrd_dt field versus the consistency of the last_update_date.
So if you find that sometimes you are missing data in your reports, alerts, etc., take a look at the last_update_date field.  You might just find what you are looking for.


SELECT DISTINCT (papf.employee_number)
, papf.full_name
, papf.email_address
, substr (hpf.name,10)
, haou.name
, bplfp.person_id
, bplfp.ler_id
, bplfp.lf_evt_ocrd_dt
, bplfp.last_update_date
, blf.name

FROM ben_ptnl_ler_for_per bplfp
, per_all_people_f papf
, ben_ler_f blf
, per_all_assignments_f paaf
, hr_positions_f hpf
, hr.hr_all_organization_units haou

WHERE 1=1
and papf.person_id=paaf.person_id
and hpf.position_id=paaf.position_id
and haou.organization_id=hpf.organization_id
and bplfp.person_id = papf.person_id
and bplfp.ler_id = blf.ler_id(+)
and sysdate between papf.effective_start_date and papf.effective_end_date
and trunc(sysdate) between trunc(paaf.effective_start_date) and trunc(paaf.effective_end_date)
and trunc(sysdate) between trunc(hpf.effective_start_date) and trunc(hpf.effective_end_date)
and trunc(sysdate) = trunc(bplfp.lf_evt_ocrd_dt)
and blf.name = 'NEW HIRE'

The code does work, so if you need it, you are free to use it.

Enjoy!

Wednesday, April 2, 2014

Oracle XML Gateway: How I was Forced to Eat Ice Cream

If you follow me on LinkedIn, I recently posted this:

I did this because I found myself guilty of doing that very thing: wanting to do something the way it has always been done.  It would have been easy.  We had all the scripts and processes worked out.  It was a no-brainer.  Our vendor, however, wanted to send us invoices through the Oracle Supplier Network (OSN).  This of course meant that we needed to configure XML Gateway for, yep, EBS 11i (no jokes please).  Why would we do that?  Our current process worked just fine:

  1. Vendor/Partner uploads a flat file to the sFTP site
  2. A scheduled script would move the file from the sFTP site to the SAMBA share
  3. Another script would archive the file and move it from the SAMBA share to the EBS file system, where it was archived again
  4. A concurrent program would run to "process" that file and when everything was done, the log files were checked to see if any errors occurred.
Did I mention that at points throughout the process, email notifications are flying back and forth letting us know whether each step was successful?  Simple, huh?

Just think ... that process was repeated for each business partner we needed to exchange data with.  But we had that down pat, why do something different?  "We've always done it this way."  I was going against my very nature.  Now there was good reason to want this.  Our functional area made a few wrong assumptions:

  • XML Gateway was already configured
  • OSN was already configured
  • Vendors were already set up as trading partners
This resulted in a project timeline that could not be met using the above features.  More on including technical resources in your planning sessions in another post. :(  Anyway, that was the only reason to go with the tried and true method.


I got a lot of pushback, so I looked at this XML Gateway, and once I understood it, I saw the potential.  If for whatever reason you don't know about it, click here. Hope Oracle doesn't get mad about that.  In a nutshell, XML Gateway is a set of services that allow you to subscribe to inbound events and consume XML data for processing.  It takes advantage of Workflow, and there is even an XML Gateway Map Builder that will allow you to remap messages to match any standard.  The Map Builder isn't the best-looking tool; it looks like Workflow Builder's cousin, but it gets the job done.

What does this mean for me as a solutions provider?

  1. Standardization on how data is delivered
  2. More solutions offerings (the other process isn't going away anytime soon)
  3. Cleaner integration into EBS
  4. Elimination of a multi-step process and multiple points of failure
  5. Deliver streamlined and repeatable processes
  6. Simpler modification process for system upgrades ...  R12.2, here we come!!!
These benefits completely outweigh the old method of doing this, and I look forward to the next opportunity to implement an XML Gateway solution.  And that, my friends, is how I was forced to eat ice cream.

Later I will show you exactly what we did.

-DB

Friday, March 21, 2014

WCLK's Signature Serenade (7pm to 9pm EST)

To my social media friends and family:
I am posting a letter from my friend Craig, and if you are a fan of Jazz music, please support him at #WCLK Radio.

To all my friends and family, 

Once again, I’m channeling my inner do-gooder as well as my radio personality.  This Wednesday, March 26th, I will guest host WCLK's Signature Serenade (7pm to 9pm EST).  I'll play my favorite Jazz tracks and discuss the impact they've had on me. 

I had so much fun last time, and we raised a bunch of money, so they asked me back again.  It will air during the station's annual membership drive, and that is why I'm coming to YOU.

I'm trying to help raise awareness and money for the station, which will celebrate its 40th anniversary in April. I encourage you to learn about the Clark-Atlanta University run station and consider helping their ongoing support of Jazz music and education. For info about them, visit WCLK.  

What I'm asking is this:

a) Tune in! Again, it's on 3/26, from 7 PM - 9 PM EST; 91.9 on your FM dial (or listen live online at Listen Live to WCLK)

But Most Importantly

b) Call in (404-880-8807) during my show and become a member and/or donate - it is tax deductible!  Even if it's just $5, every dollar counts.  My goal is to raise $10,000 during my show and online during the pledge.  If just 500 of you give $20, I meet my goal and we support an amazing radio station.  Or if you just want to stroke a check for the entire amount, I won’t get in your way. 

c) If you can't call (404-880-8807) during my time slot, you can still get involved ahead of time :

Make a secure pledge online at :  Donate to WCLK with your Visa, Master Card, American Express, Discover Card or Visa Debit Card. Make sure you put my name in the comments so I get credit! We have a friendly competition to see who raises the most money.

d) share this with your networks 

Why Making A Financial Commitment Is Important To WCLK
·  Members are Jazz 91.9 WCLK’s single most important source of funding
·  When you contribute to 91.9 WCLK, you share the joy of listening to jazz music…and with those who may be hearing some of the artists for the first time.

I thank you for your support in advance.  And did I mention this is tax-deductible, in addition to supporting a great radio station?  Speaking of mentioning: if you donate online or on the phone, please add my name in the comments or give it to the person taking your pledge.  

Craig M. Garrett

Tuesday, February 25, 2014

Auto End Date Separated Employees

A little while back, there was some concern that employees were being terminated in #Oracle E-Business Suite, but there was no process to ensure that those former employees couldn't log back into EBS.  This particular environment did not leverage any SSO technologies to manage user access, so each System Administrator was responsible for managing that process.  This can become quite cumbersome as both Responsibilities and user accounts must be End Dated in EBS. So the challenge was to automate this process to relieve the staff of the responsibility.

I launched my favorite development tool, SQL Developer, and wrote a PL/SQL procedure with two cursors to perform the following:

Cursor I
  1. Identify all separated employees.  In this environment they are considered  "Ex-Employee"
  2. Collect all Responsibilities assigned to these former employees.  We needed to capture the Responsibility Key and the Short Name to pass into the API.

Cursor II
  1. Just gives me a second copy of the separated users for End Dating the FND User record

Once we collected this information, we were ready to loop through the cursors.  The first loop End Dates each Responsibility, and the second loop End Dates the FND User record.  Interestingly enough, I tried multiple ways of combining these steps, even swapping the order.  Ultimately, processing the records in the sequence below worked.

The final step was to create and schedule this as a concurrent program in EBS and produce a log file for the process owner(s) to validate against.

Take a look, and if you can borrow from it, be my guest.

(edit) Here is the Package Spec: (02-Apr-2014)

create or replace PACKAGE ENDDATE_TERM_USERS AUTHID CURRENT_USER
-- version 1.0
/*******************************************************************************
Name: CATS_ENDDATE_TERM_USERS
Description - End Date ALL Responsibilities and the FND User accounts of
terminated users and deceased survivors.  All users with an employee type of
ex-employee will be end dated.
This process will run daily.

--------------------------------------------------------------------------
Written by          Date                   Description
---------------- ----------- -------- -----------------------------------
Danny Bryant     12-MAR-2012   New
*******************************************************************************/
IS
PROCEDURE MAIN(
    X_ERRBUF          OUT  VARCHAR2
   ,X_RETCODE         OUT  VARCHAR2
   );

END ENDDATE_TERM_USERS;


--------------------------------------------------------

Package Body

create or replace PACKAGE BODY        ENDDATE_TERM_USERS
AS
-- Version 1.0
/*******************************************************************************
Description - End Date ALL Responsibilities and the FND Users account of
terminated users and deceased survivors.  All users with employee type of
ex-employee will be end dated.
*******************************************************************************/
PROCEDURE MAIN(
                X_ERRBUF          OUT  VARCHAR2
               ,X_RETCODE         OUT  VARCHAR2
               )
IS

--DECLARE

L_EX_EMP_TYPE   NUMBER DEFAULT 1123;    -- Ex-employee
L_FND_COUNT         NUMBER := 0;
L_RESP_COUNT        NUMBER := 0;

-- cursor for responsibilities to end-date

CURSOR RESP_RECORDS
IS
SELECT 
   DISTINCT PAPF.EMPLOYEE_NUMBER AS EMP_NO 
  ,FR.RESPONSIBILITY_KEY AS RESP_KEY
  ,FA.APPLICATION_SHORT_NAME as APP_SHORT_NM
FROM PER_ALL_PEOPLE_F PAPF
  ,FND_APPLICATION FA
  ,FND_RESPONSIBILITY FR
  ,FND_USER_RESP_GROUPS_ALL FURGA
  ,FND_USER FU  
WHERE PAPF.PERSON_ID = FU.EMPLOYEE_ID 
AND PAPF.PERSON_TYPE_ID = '1123'--L_EX_EMP_TYPE  
AND FU.END_DATE IS NULL 
AND TRUNC(SYSDATE) BETWEEN TRUNC(PAPF.EFFECTIVE_START_DATE) AND TRUNC(PAPF.EFFECTIVE_END_DATE)  -- Handle the date tracking feature in EBS
AND FU.USER_ID = FURGA.USER_ID
AND FR.RESPONSIBILITY_ID = FURGA.RESPONSIBILITY_ID
AND FA.APPLICATION_ID = FR.APPLICATION_ID
AND FU.USER_NAME = PAPF.EMPLOYEE_NUMBER
AND FURGA.END_DATE IS NULL
ORDER BY 1;
--
--cursor for fnd_user records to end-date. 
CURSOR FND_RECORDS
IS
SELECT
  DISTINCT(PAPF.EMPLOYEE_NUMBER) as EMP_NO,
  PAPF.FULL_NAME FULL_NAME  
FROM PER_ALL_PEOPLE_F PAPF
  ,FND_USER FU  
WHERE PAPF.PERSON_ID = FU.EMPLOYEE_ID
AND FU.USER_NAME = PAPF.EMPLOYEE_NUMBER
AND PAPF.PERSON_TYPE_ID = '1123'--L_EX_EMP_TYPE
AND FU.END_DATE IS NULL
AND TRUNC(SYSDATE) BETWEEN TRUNC(PAPF.EFFECTIVE_START_DATE) AND TRUNC(PAPF.EFFECTIVE_END_DATE)
ORDER BY 1;
--
BEGIN
--
FOR R_REC IN RESP_RECORDS
  LOOP
    BEGIN
      fnd_user_pkg.delresp(      
                  username       => R_REC.EMP_NO
                 ,resp_app       => R_REC.APP_SHORT_NM
                 ,RESP_KEY       => R_REC.RESP_KEY
                 ,security_group => 'STANDARD');
    END;
    L_RESP_COUNT := L_RESP_COUNT + 1;
    FND_FILE.PUT_LINE (FND_FILE.OUTPUT, 'Emp Number: ' || R_REC.EMP_NO || ' End dating Responsibility: ' || R_REC.RESP_KEY);   
  END LOOP;
  FND_FILE.PUT_LINE (FND_FILE.OUTPUT, 'End Dated: ' || L_RESP_COUNT || ' Responsibilities.'); 
  
  FOR F_REC IN FND_RECORDS
  LOOP
  BEGIN
     FND_USER_PKG.UPDATEUSER (
           X_USER_NAME                  => F_REC.EMP_NO
          ,X_OWNER                     => 'SEED'
          ,X_END_DATE                  => SYSDATE);
  END;
  L_FND_COUNT := L_FND_COUNT + 1;
  FND_FILE.PUT_LINE (FND_FILE.OUTPUT, 'Employee Name: ' || F_REC.FULL_NAME || ' Employee Number: ' || F_REC.EMP_NO || ' FND End Dated on: ' || SYSDATE);   
END LOOP;
FND_FILE.PUT_LINE (FND_FILE.OUTPUT, 'End Dated: ' || L_FND_COUNT || ' FND User Accounts.'); 
COMMIT;
END MAIN;
END ENDDATE_TERM_USERS;


Enjoy