How to build a cost-effective serverless blogging platform on GCP - Part 2
by Maris Elsins on Mar 14, 2019 12:00:00 AM
Serverless Deployment
Going serverless made a lot of sense to me, especially because my blog doesn't get much traffic. The costs of these services are calculated only on actual resource consumption. GCP App Engine also provides some resources under the Always Free program, and based on my calculations, my resource usage would be low enough to use the service free of charge. Another reason for using serverless technologies is the learning opportunity. There is a common trend to decompose large, monolithic applications into smaller pieces and move as much as possible to the serverless world, which allows simpler management of the individual functions. I wanted to learn a little about how that works in practice. Unfortunately, using GCP App Engine was too simple, so my learning goal was not really fulfilled.
GCP App Engine
At this moment, we have our blog files stored locally in the _site directory. Fortunately, GCP App Engine allows deploying and serving them very easily, and as the site is built from static files only, everything is even simpler. In the root folder of our git repository we need to create a file app.yaml with the following content to tell App Engine how the deployment needs to be done:
$ cat app.yaml
runtime: python27
api_version: 1
threadsafe: true

handlers:
- url: /
  static_files: _site/index.html
  upload: _site/index.html
  secure: always
  redirect_http_response_code: 301

- url: /(.*)
  static_files: _site/\1
  upload: _site/(.*)
  secure: always
  redirect_http_response_code: 301
The file basically defines how each URL is handled by the application:
- When the root URL https://<domain>/ is accessed, _site/index.html will be displayed.
- When any other URL https://<domain>/<path> is requested, the corresponding _site/<path> file will be served.
- secure: always tells App Engine that the HTTPS protocol will always be used.
- redirect_http_response_code: 301 means that whenever a plain HTTP request is made, it will be redirected to the corresponding HTTPS URL using an HTTP 301 response.
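The routing that these two handlers express can be sketched as a small function. This is a hypothetical emulation of what App Engine does with this app.yaml, written for illustration only, not App Engine code:

```python
def route(path: str) -> str:
    """Map a request path to the static file App Engine would serve,
    per the two handlers in app.yaml (hypothetical emulation)."""
    if path == "/":
        # first handler: url: / -> _site/index.html
        return "_site/index.html"
    # second handler: url: /(.*) -> _site/\1
    return "_site/" + path.lstrip("/")

def redirect(scheme: str, host_and_path: str):
    """secure: always + redirect_http_response_code: 301 means plain
    HTTP requests receive a 301 pointing at the HTTPS equivalent."""
    if scheme == "http":
        return (301, "https://" + host_and_path)
    return (200, None)
```

For example, `route("/2019/01/post.html")` resolves to `_site/2019/01/post.html`, and an `http://` request is answered with a 301 to the `https://` URL before any file is served.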
$ git add app.yaml
$ git commit -m "adding the App Engine deployment file"
[master eeea76a] adding the App Engine deployment file
1 file changed, 22 insertions(+)
create mode 100644 app.yaml
$ git push
Counting objects: 3, done.
Delta compression using up to 2 threads.
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 451 bytes | 0 bytes/s, done.
Total 3 (delta 1), reused 0 (delta 0)
remote: Resolving deltas: 100% (1/1)
To https://source.developers.google.com/p/tempblog-me-dba/r/tempblog-me-dba-repo
c7e7fc4..eeea76a master -> master
And finally, the blog can be deployed. You'll have to choose the region to deploy the blog as part of this process.
$ gcloud app deploy
You are creating an app for project [tempblog-me-dba].
WARNING: Creating an App Engine application for a project is irreversible and the region
cannot be changed. More information about regions is at
<https://cloud.google.com/appengine/docs/locations>.
Please choose the region where you want your App Engine application
located:
[1] asia-east2 (supports standard and flexible)
[2] asia-northeast1 (supports standard and flexible)
[3] asia-south1 (supports standard and flexible)
[4] australia-southeast1 (supports standard and flexible)
[5] europe-west (supports standard and flexible)
[6] europe-west2 (supports standard and flexible)
[7] europe-west3 (supports standard and flexible)
[8] northamerica-northeast1 (supports standard and flexible)
[9] southamerica-east1 (supports standard and flexible)
[10] us-central (supports standard and flexible)
[11] us-east1 (supports standard and flexible)
[12] us-east4 (supports standard and flexible)
[13] us-west2 (supports standard and flexible)
[14] cancel
Please enter your numeric choice: 10
Creating App Engine application in project [tempblog-me-dba] and region [us-central]....done.
Services to deploy:
descriptor: [/home/vagrant/tempblog-me-dba-repo/app.yaml]
source: [/home/vagrant/tempblog-me-dba-repo]
target project: [tempblog-me-dba]
target service: [default]
target version: [20190122t031126]
target url: [https://tempblog-me-dba.appspot.com]
Do you want to continue (Y/n)? Y
Beginning deployment of service [default]...
Uploading 16 files to Google Cloud Storage
File upload done.
Updating service [default]...done.
Setting traffic split for service [default]...done.
Deployed service [default] to [https://tempblog-me-dba.appspot.com]
You can stream logs from the command line by running:
$ gcloud app logs tail -s default
To view your application in the web browser run:
$ gcloud app browse
Using a Custom Domain Name
The default URL was definitely something that I didn't want to keep because it had the unappealing "appspot.com" part. Therefore, I obtained my own domain name (you can google something like "cheap domain registrars" to find a registrar suitable for you; it should be possible to get a domain for around $10/year). Once you have your own domain name, you will need to map it to the App Engine deployment by following the instructions in the Custom Domain documentation. As part of this process, you will need to access your domain registrar and add a few more records to the DNS configuration:
- A TXT record to validate the ownership of the domain.
- The A and AAAA records that point the domain to App Engine's IPv4 and IPv6 addresses.
Once the domain mapping is in place, App Engine provisions and renews a Google-managed SSL certificate for the domain automatically. The certificate can be inspected from the command line:
$ gcloud app ssl-certificates list --filter="DOMAIN_NAMES=tempblog.me-dba.com"
ID        DISPLAY_NAME         DOMAIN_NAMES
11030802  managed_certificate  tempblog.me-dba.com
$ gcloud app ssl-certificates describe 11030802
certificateRawData:
  publicCertificate: |
    -----BEGIN CERTIFICATE-----
    ...
    -----END CERTIFICATE-----
    -----BEGIN CERTIFICATE-----
    ...
    -----END CERTIFICATE-----
displayName: managed_certificate
domainMappingsCount: 1
domainNames:
- tempblog.me-dba.com
expireTime: '2019-04-22T11:03:19Z'
id: '11030802'
managedCertificate:
  lastRenewalTime: '2019-01-22T12:03:21.890738152Z'
  status: OK
name: apps/tempblog-me-dba/authorizedCertificates/11030802
visibleDomainMappings:
- apps/tempblog-me-dba/domainMappings/tempblog.me-dba.com
Once these steps are completed, the blog should be available at the new URL - https://tempblog.me-dba.com. We could stop the configuration here, as the blog is published and everything works. However, deploying a new blog post would still require the following manual steps:
- Write the new blogpost in Markdown and store the new file in _posts directory of the repository.
- Commit the changes and push to master (so they are not lost).
- Rebuild the static files by running a Jekyll build:
$ docker run --rm --volume="$PWD:/srv/jekyll" --privileged -it jekyll/jekyll:latest bash -c "JEKYLL_ENV=production jekyll build"
- Deploy the new site to GCP App Engine.
$ gcloud app deploy
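The manual build and deploy steps above could be wrapped in a small helper. The command lines are taken from this article, but the wrapper itself is only a sketch (run the returned commands with subprocess.run if docker and gcloud are on your PATH):

```python
import os

def publish_commands(site_dir=None):
    """Return the two commands needed to rebuild the static site and
    deploy it to App Engine, mirroring the manual steps above.
    Sketch only - not executed against GCP here."""
    site_dir = site_dir or os.getcwd()
    build = [
        "docker", "run", "--rm",
        "--volume={}:/srv/jekyll".format(site_dir),
        "--privileged", "jekyll/jekyll:latest",
        "bash", "-c", "JEKYLL_ENV=production jekyll build",
    ]
    deploy = ["gcloud", "app", "deploy"]
    return [build, deploy]
```

This is exactly the pair of commands the continuous publishing setup below automates, which is why the manual workflow is worth capturing in one place first.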
Continuous Publishing
Now, we can set up a mechanism that will watch the source repository we created and detect when new code is committed to the master branch. When that happens, a jekyll build will be triggered to recreate the static files and deploy them to App Engine as a new version. This section involves using GCP Container Registry for storing the Jekyll Docker image and GCP Cloud Build for automation of the work.
Pushing the Docker Image to GCP Container Registry
Before the image is uploaded to the GCP Container Registry, I'll add one more tiny script that will take care of running the jekyll build. I'm doing this because I experienced some file access privilege issues while the build was ongoing, and they were too difficult to resolve. I think they had to do with the fact that the Docker container mounts the local /workspace directory of the Cloud Build instance to /srv/jekyll to access the required files. The script will simply copy all files to a local place (/u01) inside the container to work with them, and then copy the built _site folder back. Here's what needs to be done (this could have been done with a Dockerfile too, but I find this way a little simpler to work with):
- Start the container with an interactive bash session and create the required script file:
$ docker run --rm --volume="$PWD:/srv/jekyll" --privileged -it jekyll/jekyll:latest bash

# the following multi-line echo is a single command
bash-4.4# echo "rm -rf /u01 /srv/jekyll/_site > /dev/null
mkdir /u01
chmod 777 /srv/jekyll /u01
cp -r /srv/jekyll/* /u01/
cd /u01/
JEKYLL_ENV=production jekyll build
cp -r /u01/_site /srv/jekyll/
rm -rf /u01 > /dev/null" > /root/init_build.sh
bash-4.4# chmod a+x /root/init_build.sh
- Commit the change to the image from another session while the container is running:
$ docker ps -a
CONTAINER ID   IMAGE                                  COMMAND                  CREATED          STATUS          PORTS                 NAMES
ef8bd6153c06   gcr.io/tempblog-me-dba/jekyll:latest   "/usr/jekyll/bin/ent…"   16 seconds ago   Up 15 seconds   4000/tcp, 35729/tcp   serene_blackwell

# use the ID from the previous output
$ docker container commit ef8bd6153c06 jekyll/jekyll:latest
sha256:579bd4e195cff00abb38b967009f7727ce1814ae9f8677ce9d903a7683fcbd6e
- "Exit" from the interactive bash session of the running container.
- We're going to upload the image to GCP Container registry via command line, thus, the "Google Container Registry API" needs to be enabled for the GCP Project first. Log on to the GCP console, navigate to "APIs & Services" module, locate and click on the "Container Registry API", and finally enable it if it's not yet enabled.
- Docker images that are going to be pushed to the GCP Container Registry need to follow a specific naming scheme of [HOSTNAME]/[PROJECT-ID]/[IMAGE], so the first task is to tag the customized Jekyll image:

$ docker image tag jekyll/jekyll:latest gcr.io/tempblog-me-dba/jekyll:latest
$ docker image ls gcr.io/tempblog-me-dba/jekyll
REPOSITORY                      TAG      IMAGE ID       CREATED        SIZE
gcr.io/tempblog-me-dba/jekyll   latest   9667c6cbe572   19 hours ago   480MB
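The [HOSTNAME]/[PROJECT-ID]/[IMAGE] convention is easy to get wrong, so here is a tiny helper that assembles a registry tag from its parts. The hostnames are the public Container Registry endpoints; the helper itself is my own sketch, not a gcloud or docker tool:

```python
# Public Container Registry hostnames (region-specific storage endpoints).
GCR_HOSTS = {"gcr.io", "us.gcr.io", "eu.gcr.io", "asia.gcr.io"}

def gcr_tag(host, project, image, tag="latest"):
    """Build a Container Registry image name of the form
    [HOSTNAME]/[PROJECT-ID]/[IMAGE]:[TAG], rejecting unknown hosts."""
    if host not in GCR_HOSTS:
        raise ValueError("not a Container Registry host: " + host)
    return "{}/{}/{}:{}".format(host, project, image, tag)
```

With the article's values, `gcr_tag("gcr.io", "tempblog-me-dba", "jekyll")` yields the same name used in the docker image tag command above.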
- Next, Docker needs to be configured to make it aware of the GCP Container Registry and how to access it. This process is a super simple one-liner in Google Cloud SDK. You should have it already installed if you're working on the same machine where you set up your access to the GCP Source Repository; otherwise, you'll need to install it again:
$ gcloud auth configure-docker
The following settings will be added to your Docker config file located at [/home/vagrant/.docker/config.json]:
{
  "credHelpers": {
    "gcr.io": "gcloud",
    "us.gcr.io": "gcloud",
    "eu.gcr.io": "gcloud",
    "asia.gcr.io": "gcloud",
    "staging-k8s.gcr.io": "gcloud",
    "marketplace.gcr.io": "gcloud"
  }
}
Do you want to continue (Y/n)? Y
Docker configuration file updated.

- Finally, we can push the Docker image to the container registry:
$ docker push gcr.io/tempblog-me-dba/jekyll:latest
The push refers to repository [gcr.io/tempblog-me-dba/jekyll]
bbe16356ce18: Pushed
4ec78dd675c8: Pushed
ce753fc763b8: Layer already exists
cb136294e186: Layer already exists
bf8d0e7b5481: Layer already exists
7bff100f35cb: Layer already exists
latest: digest: sha256:bc1dd9adf3c08422b474545c82a83f1e79fce23d7feaeb9941afa0b89c093b03 size: 1580
GCP Cloud Build
We'll use GCP Cloud Build to define the build and deploy processes. I must say I was very impressed by how little configuration is required and how simple it was to instruct Cloud Build to do what I needed. In our case, we need Cloud Build to run the following two commands:
- Build the new blog files:
$ docker run --rm --publish 4000:4000 --volume="$PWD:/srv/jekyll" --privileged -it jekyll/jekyll:latest bash -c "JEKYLL_ENV=production jekyll build"
- Deploy the new version to App Engine:
$ gcloud app deploy
These commands are expressed as build steps in a file cloudbuild.yaml in the root of the repository, and its contents should look like this:
$ cat cloudbuild.yaml
steps:
- name: 'gcr.io/cloud-builders/docker'
args: ['run', '--rm', '--volume=/workspace:/srv/jekyll', '--privileged', 'gcr.io/tempblog-me-dba/jekyll:latest', '/root/init_build.sh']
- name: "gcr.io/cloud-builders/gcloud"
args: ["app", "deploy"]
timeout: "600s"
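Before committing, the structure of this file can be sanity-checked. The validator below is my own sketch (not a Cloud Build tool); it only checks the minimal schema this article uses, namely that each step carries a builder image name and an args list:

```python
def validate_steps(config):
    """Check the minimal Cloud Build config shape used in this article:
    a non-empty 'steps' list where each step has a builder 'name'
    (a container image) and an 'args' list passed to it."""
    steps = config.get("steps")
    assert isinstance(steps, list) and steps, "steps must be a non-empty list"
    for step in steps:
        assert isinstance(step.get("name"), str), "each step needs a builder image name"
        assert isinstance(step.get("args"), list), "each step needs an args list"

# The same two steps as cloudbuild.yaml, as parsed data:
config = {
    "steps": [
        {"name": "gcr.io/cloud-builders/docker",
         "args": ["run", "--rm", "--volume=/workspace:/srv/jekyll",
                  "--privileged", "gcr.io/tempblog-me-dba/jekyll:latest",
                  "/root/init_build.sh"]},
        {"name": "gcr.io/cloud-builders/gcloud",
         "args": ["app", "deploy"]},
    ],
    "timeout": "600s",
}
validate_steps(config)  # passes silently for the config above
```

Each step runs inside the named builder image against the shared /workspace directory, which is why the first step mounts /workspace into the customized Jekyll container.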
$ git add cloudbuild.yaml
$ git commit -m "Adding the instructions file for Cloud Build"
$ git push
Next, a Build Trigger needs to be created to initiate the build when a commit to master happens. Navigate to "Cloud Build" -> "Triggers" in GCP Cloud Console. Enable the Cloud Build API if you're presented with the message that it's disabled. Then create a trigger:
- Select "Cloud Source Repository" option.
- Select the name of your repository (tempblog-me-dba-repo in my case).
- Provide the following options:
- Give your trigger a name.
- Choose "branch" as Trigger Type ("Tag" is another option. In this case the trigger will watch for specific git tags that will initiate the build).
- Choose "Cloud Build configuration file (yaml or json)" as the Build Configuration.
- Make sure that the Build Configuration File Location is already "/cloudbuild.yaml".
- Click on "Create Trigger".
[Figure: Running the trigger manually]
Navigate to "History" in the left side menu. You may see that the build runs for a while, and then fails:
[Figure: The app deployment step fails at this point]
In my case, the first step succeeded (it should be the same for you too), but the second step failed. The log entries visible on the same page displayed the following:
Step #1: API [appengine.googleapis.com] not enabled on project [520897212625].
Step #1: Would you like to enable and retry (this will take a few minutes)?
Step #1: (y/N)?
Step #1: ERROR: (gcloud.app.deploy) User [520897212625@cloudbuild.gserviceaccount.com] does not have permission to access app [tempblog-me-dba] (or it may not exist): App Engine Admin API has not been used in project 520897212625 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/appengine.googleapis.com/overview?project=520897212625 then retry
This means the App Engine Admin API needs to be enabled, and the service account that Cloud Build uses (520897212625@cloudbuild.gserviceaccount.com in this case) needs additional privileges. Navigate to "IAM & Admin" -> "IAM", find the service account listed, and add the "App Engine Admin" role to that account:
[Figure: Adding the "App Engine Admin" role to the service account]
Additionally, the "App Engine Admin API" needs to be enabled by navigating to "APIs & Services". It may take a minute or two for the new privileges to take effect. Navigate back to "Cloud Build" -> "History", open the details of the failed run, and click "Retry" to run the build again. This time the build should succeed.
Test the Workflow
Everything is ready. The only thing left is to test publishing a new blog post. I'll create this simple file in the _posts directory and commit it to master (this is all that should be needed to publish new content to the blog):
$ cat _posts/2019-01-23-it_works.markdown
---
layout: post
title: "It Works!"
date: 2019-01-23 12:19:00 +0300
comments: true
categories: serverless blog
---
Congratulations! Your serverless blog works!
$ git add -A
$ git commit -m "It Works"
$ git push
In a short while, the post should be live on your new blog (it sometimes took around 10 minutes for the new version to become visible after it was deployed to App Engine, due to caching, I suppose). Congratulations!
Closing Remarks
The solution provided in this article was designed to be as inexpensive as possible, and it relies on the Always Free resources that GCP provides. Despite that, we should take care to avoid unexpected service charges in case GCP changes the limits of Always Free, or your new blog suddenly goes viral! Take the time to look at "Billing" -> "Budgets & alerts" and create an alert that will notify you if the service charges reach a certain level. I've set up a monthly budget of $2 USD and configured an alert if 50% of the budget is consumed. I can rest assured that I will be notified if the costs suddenly go up, without having to log on to the GCP console periodically to check for myself.

We haven't discussed troubleshooting and monitoring. That topic would require an article of its own; however, it's good to know that applications running on App Engine are logged and monitored by default, and by going to the "App Engine" menu in the GCP console, you will be presented with a good amount of usage information. Stackdriver can also be used to check the HTTP access logs.

I hope you've enjoyed this journey and have managed to build a working blogging platform on GCP by following these instructions. This is just a starting point, and there are definitely more things to learn and discover. There might be a learning curve for working with Jekyll - how to create new blog posts, how to customize the template and the page layouts, how to include images and where to place them in the repository, and so on. But don't be overwhelmed by the volume of new things to learn; there is plenty of helpful material on the internet to research and follow. I also understand that the level of detail I've provided about working with GCP may be insufficient for people who have never touched it before. Fortunately, there's also plenty of learning material provided by Google, including documentation to read, sample projects to work on, and videos to watch.
It's clear the cloud is not going away anytime soon, and the knowledge you gain will come in handy in the future, too. Happy blogging!