DON’T PANIC. Yes, we are doing the OOW Bloggers Meetup this year. Yes, it’s the same time. Yes, it’s the same location. Yes, it’s more fun every year.
Adaptive Query Optimization is a set of capabilities introduced in Oracle 12c that allows the optimizer to discover additional information about statistics and make run-time adjustments to execution plans to improve them. This is a major change in optimizer behaviour from 11g.
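For context, this behaviour is controlled by init parameters: 12.1 exposes a single OPTIMIZER_ADAPTIVE_FEATURES parameter, while 12.2 splits it in two. A sketch of the 12.2 parameters (the values shown are the 12.2 defaults, not a recommendation; verify against your own version before changing anything):

```sql
-- 12.2 parameters controlling Adaptive Query Optimization
-- (12.1 instead exposes a single OPTIMIZER_ADAPTIVE_FEATURES).
ALTER SYSTEM SET optimizer_adaptive_plans = TRUE SCOPE=BOTH;
ALTER SYSTEM SET optimizer_adaptive_statistics = FALSE SCOPE=BOTH;
```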
I would recommend that anyone planning an upgrade to 12c familiarize themselves with the following white papers from Oracle.
On September 12, 2016, a security vulnerability affecting all current versions of MySQL and its variants from various vendors (Oracle MySQL, Percona Server, and MariaDB) was disclosed. The vulnerability, CVE-2016-6662 (http://legalhackers.com/advisories/MySQL-Exploit-Remote-Root-Code-Execution-Privesc-CVE-2016-6662.html), allows an attacker to overwrite the MySQL config file, injecting settings that take effect the next time MySQL is restarted.
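A commonly suggested hardening step (a sketch only; the real defense is the vendor patch) is to make sure every config file mysqld reads already exists, is owned by root, and is not writable by the mysql user, so the attacker can neither create nor modify it. The path below defaults to a scratch file so the sketch is runnable; on a real server it would be e.g. /etc/my.cnf, run as root:

```shell
# Pre-create the config file and strip write access for non-owners.
# CNF defaults to a scratch file here; substitute /etc/my.cnf in production.
CNF="${CNF:-$(mktemp)}"
: > "$CNF"            # ensure the file exists so it cannot be created by an attacker
chmod 0644 "$CNF"     # owner read/write, everyone else read-only
ls -l "$CNF"
```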
Data has a way of growing so quickly that it can appear unmanageable, and that is where data archiving shines. Archiving helps users maintain a clean environment on their primary storage, and it is also an excellent way to keep backup and recovery running smoothly. Mistakes and disasters can and will happen, but with successful archiving, data can be easily recovered and potential problems averted. Another benefit to archiving…
A common concern among companies is whether their resources are being used wisely. Never before has this been more pressing than when considering data storage options. Teams question the inherent worth of their data, and often fall into the trap of viewing it as an expense that weighs too heavily on the budget. This is where wisdom is needed with respect to efficient…
Databases or schemas tend to get moved around between servers or even datacenters for hardware upgrades, consolidations, or other migrations. And while the work that needs to be done is pretty straightforward for DBAs, I find the most annoying aspect is updating all the client connect strings and TNS entries with the new IP addresses and, if not using services, the SID as well, since the instance name might have changed.
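For context, a typical tnsnames.ora entry hard-codes exactly the pieces that change in such a move: the host address and, when not using services, the SID. All names and addresses below are hypothetical:

```
ORCL =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = 192.168.1.10)(PORT = 1521))
    (CONNECT_DATA =
      (SERVICE_NAME = orcl.example.com)
    )
  )
```

Pointing HOST at a DNS alias instead of a raw IP, and connecting to a service name rather than a SID, keeps entries like this stable across migrations.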
HashiCorp’s Terraform is a powerful tool for managing diverse infrastructure as code and automating deployment tasks at the infrastructure layer, using provider-exposed APIs such as those for AWS and vSphere.
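As a minimal sketch of what "infrastructure as code" looks like here (the provider, region, and AMI id are illustrative placeholders, not recommendations):

```hcl
# Declare the provider whose API Terraform will drive.
provider "aws" {
  region = "us-east-1"
}

# A single EC2 instance described declaratively; `terraform apply`
# reconciles real infrastructure with this description.
resource "aws_instance" "example" {
  ami           = "ami-0123456789abcdef0"  # placeholder AMI id
  instance_type = "t2.micro"
}
```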
This is just a short note linking to several articles on Alberto Dell'Era's blog. Alberto is a long-time member of the OakTable Network and has published a number of posts I find very useful, which are linked here: