
Storm Petrel, LLC



How to Query Oracle Font APEX

16 June 2021 by Christina Moore

Abstract

This is an article about using classic Oracle database tools to query Font APEX as a dataset. It is also an article about failure, my failure(s) in achieving this goal. If you don’t care about the journey, the frustrations, and the cool new tools I explored, just follow the RESTful link to get Font APEX as JSON: https://g2235c8c2b07256-cmoore.adb.us-ashburn-1.oraclecloudapps.com/ords/cm/font/apex

I did explore Oracle 21c, Multilingual Engine (MLE), JavaScript, JS variables, and a whole lot of other fun stuff for days and days before writing too much code to solve a simple problem I first had in 2017. In short, I failed. Sure, I got the data I wanted, but I failed at elegant, efficient, and pretty. I may have learned something along the way, including that writing about failure might be fun.

Challenge

Please go do this, then show me how easy and elegant it ought to be. Then ping me back with a positive and warm vibe on Twitter (@cmoore_sp). Hey trolls, I already admitted I flopped, and I’ve got stuff to learn.

History

During the summer of 2017, our team supported a Toronto-based firm with a custom application for quality assurance work. I thought it would be cool if, during setup and management of the application, the client could pick their own icons for the various tasks they set up in their QA tracking. Oracle is a terrific database. APEX, or Application Express, is a low-code rapid development tool. Sadly, that moniker under-represents the power of APEX. Our firm has managed billions of dollars with APEX, and 400,000 PDF documents within a single application with 2,000 users engaging in daily tasks in both English and Spanish. APEX has serious capabilities.

I wasn’t successful at finding a view or a synonym with the information. When I digested the Universal Theme sample app, I gleaned little more information there. I wrote Christian Neumueller (Twitter: @chrisNeumueller) of the Oracle APEX team in July of 2017 with the question. He wrote back stating that there was nothing “server-side”, meaning nothing in the database itself. Font APEX is thus buried within the labyrinth of CSS and JS.

I extracted the data I needed with expedience and zero grace, and decided I needed to explore further. Since then, I’ve deployed into Hurricane Harvey in Texas, then deployed to Puerto Rico for two hurricanes (Irma and Maria). For the last 16 months, our team has been supporting various COVID funding initiatives with our software. Y’know, a global pandemic.

Four years later, I still want the ability to query APEX font choices with classic database tools. As I returned to the question, I also recognized that I have absolutely no need to do this stupid task. I sure have spent days and days trying.

Objectives

I started this project in 2017, then again in May of 2021 with a few objectives.

  1. Do something cool that results in code worthy of praise and recognition, challenges the boundaries of my knowledge, and may teach someone else something.
  2. Learn something cool along the way myself. Push myself.
  3. Publish both the code and Font APEX as a dataset that can be refreshed with a click.

Successes

  • I pushed my knowledge and did learn stuff.
  • I pushed myself into 21C and JavaScript and Multilingual Engine (MLE).
  • I published a RESTful link for Font APEX.

But I really consider my effort a failure, a flop, and a frustrating fizzle. I want to publish anyway, even if that means recognizing I needed help long before I gave up.

Failure

Following the series of API articles I have published in recent years, I thought I’d put up an APEX application with an API that fetched Font APEX dynamically, making it available as an Oracle table and/or a JSON dataset.

Playing Along At Home

If you are going to follow along at home, please install the Universal Theme Sample Application on your own system. From the APEX main application page, click “Create >”, then click the Productivity App card. From there, install the “Universal Theme Sample Application”. It takes only a few minutes, and as you explore and hack, you won’t be whackin’ away on someone else’s server. It is just more respectful to your neighbors and friends. We have an APEX workspace we call “Backyard” in which we play.

APEX Team Answer

In July of 2017, I got this in an email from the APEX team at Oracle:

You could also write some script to generate records from images/themes/theme_42/1.1/js/demo/icons.js, or use that (probably just the global JavaScript variable apexIcons) in your app’s UI to generate the “pick an icon” dialog.

APEX Team at Oracle

I haven’t a clue what I did three hurricanes and one pandemic ago. I do know we delivered an application with a table that included APEX fonts.

I acknowledge I am a JSI, a JavaScript Idiot. Soy idiota. This handicap came about in the mid-1980s when I was learning programming. I resisted the computer science degree, which focused on big-iron mainframes. Instead, I learned Borland’s Turbo Pascal in independent study. Clancy and Cooper’s book “Oh! Pascal!” sits proudly on the shelf behind me. The syntax and structures I learned in the early 1980s subsequently became the foundation for our PL/SQL. I’d learned BASIC in high school on a PDP-11. I am clumsy and awkward with object-oriented languages.

We all must learn to celebrate diversity, mustn’t we?

I knew Christian left me a serious clue in his email. The clue is still there, ignored by me.

Step 1 – API

I had a plan! I can readily find icons.js, and I thought I could automate that a bit. The May 2021 (APEX 20) path, if you have the Universal Theme Sample Application installed, is something like:

/i/themes/theme_42/1.6/js/demo/icons.js

In July of 2017, it was:

/i/themes/theme_42/1.1/js/demo/icons.js

I see a pattern. We have a global substitution, #IMAGE_PREFIX#, to get to the ‘/i/’ in APEX.

With a too-quick look, I see squiggle brackets and square braces, and immediately assume I am looking at JSON. Sweet.

I pull out my trusty API template (see article: https://storm-petrel.com/2021/05/21/oracle-plsql-api-template/). Within a few minutes, I have pulled the data into my API_STAGING table as an HTTP response (a CLOB).
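The fetch itself boils down to a few lines with apex_web_service. This is a minimal sketch; the URL placeholder and the API_STAGING column name are illustrative, so adjust them for your own environment:

```sql
DECLARE
   l_response clob;
BEGIN
   -- Pull icons.js from the Universal Theme demo folder as a plain HTTP GET
   l_response := apex_web_service.make_rest_request(
      p_url         => 'https://your-server/i/themes/theme_42/1.6/js/demo/icons.js',
      p_http_method => 'GET');

   -- Stage the raw response for later parsing attempts
   INSERT INTO api_staging (http_response) VALUES (l_response);
   COMMIT;
END;
```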

Fail

Several of you playing along at home already know what I did not know: icons.js is not JSON but a JavaScript variable, a JS object literal. I don’t know how to parse that. There has to be a way! I can admit I wrote some bad PL/SQL code that tried, and failed.
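For the record, the distinction that tripped me up looks like this (a trimmed-down illustration, not the full file). What icons.js contains is an assignment to a variable, with unquoted keys:

```js
// icons.js: a JS object literal assigned to a variable; keys are unquoted
var apexIcons = { ACCESSIBILITY: [ { name: "fa-blind" } ] };
```

Strict JSON, by contrast, has no `var`, no assignment, and every key quoted:

```json
{ "ACCESSIBILITY": [ { "name": "fa-blind" } ] }
```

Only the second form parses with JSON tools like JSON_TABLE.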

Step 2 – JS Object to JSON

With the minor setback of not knowing much about JS variables, I rapidly developed a new plan.

I must learn how to stringify a JavaScript object into my familiar JSON. Then I’ll win and move forward. I can now stringify safely because I learned this last month.

<script>
  var icons = JSON.stringify(apexIcons);
  document.write(icons);
</script>

I read a blog post by Joel Kallman from November of 2018 discussing the Oracle Database Multilingual Engine (MLE), which is now available to us in Oracle release 21c.

Great: toss my JSON.stringify into familiar PL/SQL code using Oracle’s MLE.

Now I am on the leading edge! Watch me go. I just need 21c, which came out in 2021 (clever name, eh?). But our team just got fully stable on 19c after a miserable few months, so I can’t ask us to upgrade.

This gives me an excuse to set up a project on Oracle Cloud Infrastructure (OCI) with the always-free option. I pop my API template up, install the Universal Theme, and fetch down a lovely CLOB filled with my JavaScript object (variable). Because I am a database person, I know well how to do a SELECT on data. Christian’s hint about the “global JavaScript variable apexIcons” is still floating in the stupid corner of my brain. I don’t know how to connect that phrase with “write a script”.

Give me a minute on 21c with MLE, plus a goodly handful of brilliant articles, and I’ll master this task.

For those wincing at home: I have wasted a significant number of hours, even days, at this point.

I give Oracle my personal credit card to set up my free OCI session, and that has gotten easier over the years. Cheers to Oracle for streamlining it and working at catching up. Well done, you.

I write a bit of code with my new-found friend MLE, then put it into SQL Developer linked to my Oracle Cloud. It runs perfectly, but I don’t see anything. I look carefully at the blog posts and see folks using the APEX SQL Workshop “SQL Commands” to run this stuff. Ok, I haven’t been in SQL Commands since 2015. Copy, paste, run!

DECLARE
   ctx      DBMS_MLE.context_handle_t;  -- create_context returns a context handle, not a varchar2
   l_js     clob;
BEGIN
    -- The JavaScript source: the apexIcons object literal plus the stringify call
    l_js := q'~
var apexIcons =
{
	ACCESSIBILITY:[
		{name:"fa-american-sign-language-interpreting"},
		{name:"fa-asl-interpreting"},
		{name:"fa-assistive-listening-systems"},
		{name:"fa-audio-description"},
		{name:"fa-blind"},
		{name:"fa-braille"},
		{name:"fa-deaf"},
		{name:"fa-deafness"},
		{name:"fa-hard-of-hearing"},
		{name:"fa-low-vision"},
		{name:"fa-sign-language"},
		{name:"fa-signing"},
		{name:"fa-universal-access"},
		{name:"fa-volume-control-phone"},
		{name:"fa-wheelchair-alt"}
	]
};
    var gfg = JSON.stringify(apexIcons);
    console.log(gfg);
    console.log("Goodbye Cruel World!");
    ~';
   ctx := DBMS_MLE.create_context();
   -- console.log is routed to server output; the blog posts ran this in APEX SQL Commands
   DBMS_MLE.eval(ctx, 'JAVASCRIPT', l_js);
   DBMS_MLE.drop_context(ctx);
END;
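What I really wanted, of course, was the stringified result back in PL/SQL rather than in a console. If I read the 21c DBMS_MLE documentation correctly, the mle-js-bindings module can pass values between the JavaScript and PL/SQL worlds. A sketch, which I have not hardened against my real data:

```sql
DECLARE
   ctx     DBMS_MLE.context_handle_t;
   l_js    clob;
   l_json  clob;
BEGIN
   ctx  := DBMS_MLE.create_context();
   l_js := q'~
      const bindings = require('mle-js-bindings');
      const apexIcons = { ACCESSIBILITY: [ { name: 'fa-blind' } ] };
      // hand the stringified object back to the PL/SQL side
      bindings.exportValue('icons_json', JSON.stringify(apexIcons));
   ~';
   DBMS_MLE.eval(ctx, 'JAVASCRIPT', l_js);
   -- pull the exported value into a PL/SQL variable
   DBMS_MLE.import_from_mle(ctx, 'icons_json', l_json);
   DBMS_MLE.drop_context(ctx);
   dbms_output.put_line(l_json);  -- JSON text, now fair game for JSON_TABLE
END;
```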

I am scriptifying!

So close. And fail!

Fail

I need to fetch my APEX icon data. We’ve established I don’t know how to dance the JS rhythm. And I know I have the JavaScript object warm and happy in my Oracle table API_STAGING.HTTP_RESPONSE. I’ll just query that, plop it in there, stringify the object, and then I’ve won the game.

I can’t even inventory my attempts; the hours are lost to me forever. I roll into an APEX page to play there, only to blow error after error. Eventually I suspect my data. A query with dbms_lob.getlength informs me that I have 44,036 characters in my CLOB. This exceeds the 32,767 limit on varchar2 stuff, good ole 2 to the 15th power minus 1. Ugh.
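The length check itself is a one-liner; the table and column names here come from my own staging schema:

```sql
-- 32,767 is the varchar2 ceiling; anything longer must stay a CLOB
SELECT dbms_lob.getlength(http_response) AS clob_len
  FROM api_staging;
```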

Achievement

Ok, I scriptified within PL/SQL. There was no particular object-oriented elegance to the effort. I do have a real job that requires hours, effort, and attention. So far, I’ve burned at least four days on this effort. I must return to my real job.

RESTive

Brute-force programming results in ugly, longish code. It is also embarrassing. I’ll not publish my mess; I do have some ego left. Many of my efforts with APIs result in a restful state for myself, my writing, and even my data. This process left me entirely restive and frustrated.

Instead, I’ll publish the results:

https://g2235c8c2b07256-cmoore.adb.us-ashburn-1.oraclecloudapps.com/ords/cm/font/apex

Filed Under: Oracle APEX Tagged With: API, failure, json, json.stringify, mle, multilingual engine, OCI, oracle, oracle apex, Oracle Cloud Infrastructure, plsql, restful

Oracle PL/SQL Storing Large Objects on AWS S3

8 June 2021 by Christina Moore

Abstract

Storing large files in an Oracle database can be expensive and cumbersome. The expense comes from the size of the on-line disk storage and the effort of running backups with archive logs. Oracle Cloud Infrastructure introduced a REST API for large object storage (multimedia files, etc.). This feature has been available with Amazon Web Services (AWS) for years. I wrote and published an API for AWS S3 storage using PL/SQL in 2016 and 2017. With AWS S3, a development team can reliably store binary large objects (BLOBs / multimedia files) securely and cost-effectively. In this article, I’ll explore the advantages of, and the updates I have made to, the AWS S3 package.

Operational Challenge

When I went to play with AWS S3 in 2016, I discovered that the original work by Morten Braten and Jason Straub predated AWS’ version 4 security protocol. I have no memory of why I spent weeks working through my AWS4_S3_PKG, which I published here: https://github.com/cmoore-sp/plsql-aws-s3. I am glad I did the work then.

Storm Petrel LLC earned a contract with the Government of Puerto Rico to support recovery efforts following the twin Hurricanes Irma and Maria that devastated the island during the autumn of 2017. By early 2018, we had 2,000 users on our Oracle APEX grants management application, Tempest-GEMS. Users entered financial data, then uploaded supporting documentation (invoices, contracts) as PDFs to our application. We stored these documents as BLOBs in our Oracle table. We struggled to keep up with the demand for storage space. We also had an API that permitted specialty projects to upload documents. A few times, we experienced 20,000 new PDFs in a single evening’s upload.

Data costs soared and our document table’s performance suffered. When a document was uploaded, we stored it in the table. Within 15 minutes, it would be duplicated into the transaction log by Oracle. Each evening, we ran full backups. Each weekend, we imaged our servers. 1 became 2, 2 became 4. If your archive/transaction log fills its allocated disk space, Oracle stops operating. We spent weeks chasing the exponential growth of our data storage.

On-line disk space at AWS gets expensive when you get to very large sizes.

We created a support problem. We created a problem with escalating costs. We created a performance problem.

We decided to push our documents to external, long-term storage. The Oracle Cloud Infrastructure product did not then exist; today, it would serve nicely as a solution. In February and March of 2018, we rapidly implemented a process to store documents at AWS S3.

Storm Petrel’s Workflow

Our workflow took a very conservative approach. Our team supported the people of Puerto Rico through the early stages of a long-term financial recovery process. We had to avoid both failure and performance issues. People all over the islands were using mobile phones to push documents up to our application. We observed that pushing through Oracle directly to AWS S3 slowed the upload process: there were two hops between the user on a mobile network and the affirmative acknowledgement from AWS S3.

Storm Petrel opted to continue uploading the documents into a BLOB column within our documents table. We added several fields to this table:

  • S3 Bucket
  • S3 Prefix (file “path”)
  • S3 filename
  • S3 eTag
  • Document Filesize
  • S3 Version ID
  • S3 Last Modified Date
  • S3 Filesize
  • S3 Transferred
  • S3 confirmed
  • Status

These columns may appear redundant, but they provided us the incremental data we needed to be confident of our process.

  • A document/multimedia file got uploaded to the table
    • The status was set to identify that it remained in the table “only”
  • A DBMS scheduler job pushed the documents to S3
    • The status was set to identify that the document was in both the table and S3 (“both”)
    • We captured the S3 information such as the eTag, version ID, modified date, and filesize
  • A separate DBMS scheduler job confirmed the S3 status. This ran hourly.
  • Weekly, on Sundays, we removed from the Oracle table the BLOBs we had confirmed were present at S3

We could have chosen to be more aggressive with removing BLOBs from our table, executing the job daily. We would have saved space and thus costs for storage, but we took a balanced approach with costs and reliability. We run a full system image backup on Saturday nights. By deleting the BLOBs on Sunday, we gave ourselves a recovery path.
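The Sunday cleanup job can be sketched roughly as follows. The table and column names here are illustrative, not our production schema:

```sql
-- Free the BLOB storage for documents confirmed as present at S3
-- (illustrative names: documents, blob_content, status, s3_confirmed)
UPDATE documents
   SET blob_content = EMPTY_BLOB(),
       status       = 'S3_ONLY'
 WHERE status       = 'BOTH'
   AND s3_confirmed = 'Y';
COMMIT;
```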

We executed this process on-the-fly while in production with thousands of simultaneous users.

By the end of 2018, we stored over 400,000 PDF documents. Those documents required 1.8 TB of storage at AWS S3. These were the financial records of over $5 billion in U.S. grant moneys provided to the people and organizations of Puerto Rico, all of whom are U.S. citizens.

Storm Petrel did not win the long-term contract with Puerto Rico. That contract went to a much larger multi-national firm based in Canada. In December of 2018, we were tasked with providing the new contractor with the documents. AWS S3 simplified the hand-over process. The documents were isolated from our servers and databases, so the hand-off posed very little risk to our systems.

On-Going Document Storage

Storm Petrel continues to support states and other organizations managing data related to disasters, including the COVID19 pandemic. The document storage process we implemented for Puerto Rico remains in place 2½ years later. The download process in Oracle APEX pulls from AWS S3 (or the Oracle table) seamlessly and rapidly. The user is never aware of the location and the download speed is fantastic (given your last-mile connection to Ye Olde Internet).

Challenges with Spanish

In the early days of pushing documents to AWS S3, we encountered challenges with filenames written in Spanish. Several of the characters commonly found on Spanish keyboards do not encode well for HTTP. We rapidly learned to use the document table’s primary key as the filename for S3. The filenames are entirely and totally ugly on AWS S3, given they are all strings of numbers.
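The naming convention is simple to sketch; the prefix and extension handling here are illustrative, not our exact production code:

```sql
-- Build an ASCII-safe S3 key from the document's primary key,
-- sidestepping URL-encoding trouble with accented Spanish filenames
l_s3_key := l_s3_prefix || '/' || to_char(p_doc_id) || '.pdf';
```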

PL/SQL AWS S3 Package Updated

Storm Petrel has written several application programming interfaces (APIs) in recent years. I revisited my own work from 2016/2017; some of that code dates to 2011. I strove to consolidate the code more aggressively. The package has lost about 400 lines of code with no loss of performance.

AWS Workflow

Following the workflow above, these are the procedures/functions that get called:

  • A DBMS scheduler job pushes the documents to S3
    • Calls a procedure that loops through the document table for any document not yet pushed to S3
    • Calls AWS4_S3_PKG.Put_Object, copying the BLOB from the Oracle table to a designated AWS S3 bucket. The “key”, to AWS, is the file path.
  • A separate DBMS scheduler job confirms the S3 status. This runs hourly.
    • Calls procedure AWS4_S3_PKG.Object_Head, which gets the HTTP “head”, or header data, from AWS. This biographical information provides us with the eTag, size, date, and last-modified details in the headers returned from AWS (apex_web_service.g_headers).
    • We store the AWS object header data in our documents table after verifying the match.
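Reading those response headers looks roughly like this. Treat it as a sketch: the local variable names are illustrative, and the exact header names AWS returns should be verified against its documentation:

```sql
-- After a HEAD request, apex_web_service.g_headers holds the response headers
FOR i IN 1 .. apex_web_service.g_headers.count LOOP
   CASE apex_web_service.g_headers(i).name
      WHEN 'ETag'           THEN l_etag     := apex_web_service.g_headers(i).value;
      WHEN 'Content-Length' THEN l_filesize := apex_web_service.g_headers(i).value;
      WHEN 'Last-Modified'  THEN l_modified := apex_web_service.g_headers(i).value;
      ELSE NULL;  -- ignore headers we do not track
   END CASE;
END LOOP;
```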

We do not provide a means of deleting BLOBs nor the corresponding files on S3. We mark rows as “archived”. All of our queries, reports, and procedures disregard archived data (they filter on archived = ‘N’). We had engaged in this practice long before arriving in Puerto Rico. Given that we work with financial funds related to the U.S. government, we tag data changes with date/time stamps, the username, and the IP address of the change.

Of course, we did encounter a situation in 2018 where the government informed us that we’d lost 5,000 documents. They magically disappeared on a Saturday morning. Within an hour, we provided proof that the documents existed but had been marked “archived” by a specific user. We provided the date, time, and IP address of the actions, actions that could only have been executed with deliberate intent.

GitHub

I did nothing to maintain the PL/SQL code for the AWS S3 interface at my GitHub site over the last few years. I know we ran into issues with bugs and such. The most common error is that our Oracle code and AWS disagreed about the time; often it was the time-zone offset from UTC.

I am not going to reconcile the old code with the new. Instead, I am posting the code we’ve run in production. It is backwards compatible.

Please take from it as you will and enjoy.

The Github link is: https://github.com/cmoore-sp/plsql-aws-s3

And of course, confidential stuff has been replaced with ‘xxxxx’. Do substitute your own API keys and such where needed.

Filed Under: Oracle APEX Tagged With: API, Application Express, AWS, AWS API, AWS S3, BLOB, Large Object Storage, OCI, OCI API, oracle apex, Oracle Cloud Infrastructure

Footer

We are Storm Petrel

We are a team of professionals dedicated to designing, building, hosting, and supporting enterprise-class software applications using Oracle APEX and PL/SQL

Learn more about us.

Powered by Oracle

PO Box 96, West Halifax VT 05358

sales@storm-petrel.com

  • GSA Schedule Information
  • Blogs
  • Privacy Policy

Copyright © 2025 · Storm Petrel LLC
