Category Archives: Art of BI
Which PostgreSQL Upgrade Option Should You Choose?
When you’re upgrading your PostgreSQL database, you have two primary methods for handling the process. Properly preparing for your upgrade and selecting the right approach for your organization depend on fully understanding the available options.
Assessing Your PostgreSQL Upgrade Readiness
You don’t want to jump right into your upgrade, as that can lead to a major disruption of your daily business operations. Careful planning is the most important step, especially when you’re working with mission-critical systems.
You’ll want full visibility into your current PostgreSQL environment, as well as any issues that could cause problems during an upgrade. The factors that deserve the most attention include your installed PostgreSQL extensions, compatibility of configuration parameters between the source and target versions, columns that use reg* data types, and any columns of the unknown data type.
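As one illustration of this kind of assessment, here's a minimal Python sketch (a hypothetical helper, assuming the psycopg2 driver and a placeholder connection string) that lists user table columns using reg* or unknown data types, since these are commonly flagged by the upgrade tooling:

```python
# Hypothetical pre-upgrade check: list user table columns whose data type is
# a reg* type or the 'unknown' type, since these often block an upgrade.
# Assumes the psycopg2 driver is installed; the connection string is a placeholder.
import psycopg2

QUERY = """
SELECT n.nspname, c.relname, a.attname, t.typname
FROM pg_catalog.pg_attribute a
JOIN pg_catalog.pg_class c ON a.attrelid = c.oid
JOIN pg_catalog.pg_namespace n ON c.relnamespace = n.oid
JOIN pg_catalog.pg_type t ON a.atttypid = t.oid
WHERE a.attnum > 0
  AND NOT a.attisdropped
  AND c.relkind IN ('r', 'm')
  AND n.nspname NOT IN ('pg_catalog', 'information_schema')
  AND (t.typname LIKE 'reg%' OR t.typname = 'unknown')
ORDER BY 1, 2, 3;
"""

conn = psycopg2.connect("dbname=mydb user=postgres host=localhost")  # placeholder
with conn, conn.cursor() as cur:
    cur.execute(QUERY)
    for schema, table, column, typename in cur.fetchall():
        print(f"{schema}.{table}.{column} uses type {typename}")
conn.close()
```

Anything this query surfaces is worth resolving (or at least documenting) before you schedule the upgrade window.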
Once you establish the PostgreSQL baseline, you want to create a timeline that offers sufficient time to fully migrate your organization’s data, make any changes to your applications for compatibility purposes, and go through a complete testing cycle to catch any problems before they make it to production.
Establishing this schedule depends on the IT resources available to manage the upgrade, database sizes, the maintenance windows scheduled, the amount of storage you have available, and the scope of any compatibility changes.
Once you’ve answered all of these questions in your planning, you can start exploring which PostgreSQL upgrade option makes the most sense for your organization. You may need to tweak your plan depending on the option you select, but that’s usually a minor adjustment. Picking the process first and doing the planning afterward, on the other hand, can waste effort if the two turn out to be mismatched.
Pg_dumpall Upgrade Process
PostgreSQL has a logical backup tool called pg_dumpall. It dumps all of the data in your databases to a plain SQL script on disk, which you then reload into the target PostgreSQL version to complete the upgrade.
This upgrade method excels when you’re working with smaller databases. Downtime is minimal when you don’t have a lot of data to dump, so you’re up and running on the new version quickly. Because the data is rewritten on reload, this method also eliminates fragmentation, which shrinks table and index sizes; if a database’s storage footprint has gotten out of control, this is one way to quickly rein it in. You’re also able to set up upgrades where the source and target databases are on different servers, which makes moving to distributed servers or new hardware simple.
One of the biggest problems with this upgrade option is that you have to either shut down the application completely or keep it in single-user mode during the dump process. You don’t want any writes on the source database while you’re upgrading, and this is the way to avoid that.
When you have applications that must be accessible at all times, taking them down for the dump process may be logistically challenging. You also need additional disk space for the new cluster and the dumped data.
A quick overview of the pg_dumpall upgrade process is below, with a brief scripted sketch of the dump-and-restore step after the list:
1. Install the PostgreSQL 13 binaries.
2. Install PostgreSQL extensions.
3. Initialize the PostgreSQL cluster.
4. Run pg_dumpall against the old cluster and restore the output with psql on the new cluster.
5. Validate the data and objects in the database.
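For illustration, here's a minimal Python sketch of step 4 in the list above. The port numbers, file path, and user are placeholders you'd adapt to your environment; it assumes the old cluster listens on port 5432 and the freshly initialized PostgreSQL 13 cluster on port 5433.

```python
# Minimal sketch of the pg_dumpall dump-and-restore step (step 4 above).
# Assumes: old cluster on port 5432, new PostgreSQL 13 cluster on port 5433,
# and that the invoking user can authenticate as the postgres superuser.
import subprocess

DUMP_FILE = "/var/lib/pgsql/upgrade/full_dump.sql"  # needs enough free disk space

# 1. Dump every database, role, and tablespace definition from the old cluster.
subprocess.run(
    ["pg_dumpall", "--port", "5432", "--username", "postgres", "--file", DUMP_FILE],
    check=True,
)

# 2. Replay the dump against the new cluster with psql.
#    (pg_dumpall produces a plain SQL script, so it's reloaded with psql
#    rather than pg_restore.)
subprocess.run(
    ["psql", "--port", "5433", "--username", "postgres",
     "--dbname", "postgres", "--file", DUMP_FILE],
    check=True,
)

print("Dump and reload complete; run your validation checks next.")
```

In practice you'd stop application traffic (or switch to single-user mode) before the dump and capture the psql output so you can review any errors during validation.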
Pg_upgrade Upgrade Process
You can upgrade in place when you use pg_upgrade, and you get two modes for making this happen: copy mode and link mode. With link mode, your database can be upgraded in seconds; with copy mode, the time depends on how big the database is, because it copies the data files from the source PGDATA directory to the target.
Upgrading in place is logistically easier than moving between servers, and it’s far faster for big databases because you don’t need to move all that data around. Link mode also reuses the existing data files through hard links, so you don’t need a second copy’s worth of storage.
You can only perform this upgrade on the same server, and you do need to account for the downtime.
A quick overview of the pg_upgrade process is below, with a scripted sketch of the check and upgrade steps after the list:
1. Install PostgreSQL 13 binaries.
2. Initialize the new PostgreSQL cluster.
3. Install PostgreSQL extensions.
4. Execute pg_upgrade with the -c (check) option to verify that the clusters are compatible without changing any data.
5. Execute pg_upgrade and review your logs.
6. Validate your data and objects.
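As a rough illustration of steps 4 and 5, the Python sketch below wraps pg_upgrade in its check and execution phases. The binary and data directory paths are placeholders for your actual old and new clusters, and --link is included only to show link mode.

```python
# Minimal sketch of running pg_upgrade (steps 4 and 5 above).
# The directory paths are placeholders for your actual old/new
# binary and data directories.
import subprocess

common_args = [
    "pg_upgrade",
    "--old-bindir", "/usr/pgsql-9.5/bin",
    "--new-bindir", "/usr/pgsql-13/bin",
    "--old-datadir", "/var/lib/pgsql/9.5/data",
    "--new-datadir", "/var/lib/pgsql/13/data",
    "--link",  # link mode; drop this flag to use copy mode instead
]

# Step 4: run the compatibility check only (--check changes no data).
subprocess.run(common_args + ["--check"], check=True)

# Step 5: perform the actual upgrade, then review the logs and the
# follow-up scripts pg_upgrade leaves behind (e.g. for refreshing statistics).
subprocess.run(common_args, check=True)
```

Both clusters must be shut down before the real run, and with link mode you should not restart the old cluster afterward, since the two clusters share data files.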
Learn more about the upgrade process to PostgreSQL 13 in our white paper, “You Can’t Put Off a PostgreSQL v9.5 Upgrade Anymore – End of Life is Here,” and contact us if you want to leverage our talented database specialists for your upgrades from start to finish.
Fixing Critical Database Outages By Moving to the Oracle Cloud
Experiencing serious availability issues with your Oracle databases? You’re not alone—but the good news is that help is available. One of our recent clients had exactly these problems before reaching out to us for an Oracle EPM cloud migration.
Datavail’s client is a global safety company that operates in 20 different countries around the world, servicing industries from aerospace and transportation to petrochemicals and mining. The client had two major reasons for working with Datavail:
- First, the client was using Oracle Hyperion EPM 11.1.2.3, and the 11.1.x line is scheduled to reach its end of support (EOS) date in December 2021. Past this date, Oracle customers will fall out of compliance and miss out on valuable new features and functionality.
- Second, the client was suffering critical database outages at extremely inopportune times. In one particularly severe case, the client’s database went down for 72 hours at the start of a financial close period, depriving them of essential information.
As a result of these outage issues, the client knew that the time had come to migrate from on-premises to the Oracle EPM cloud. In particular, the fact that support and maintenance obligations would no longer be their responsibility sealed the deal for the client.
Datavail first helped the client put out its fires by getting the Hyperion system back online, and then worked with the client on a two-part migration that included both Oracle Financial Consolidation and Close (FCCS) and EPM Cloud Planning (PBCS). First, Datavail completed the FCCS migration, and then ensured that the PBCS group could leverage the FCCS structure and data. Beyond the migration itself, Datavail also helped the client handle some historical data validation issues, automate its data loads and backup processes, clean up and streamline its datasets, and get rid of unused parts of its Hyperion Planning application.
Thanks to the project’s success, the client has retained Datavail as their choice of Oracle managed services partner. The benefits that the client has seen from this collaboration with Datavail include:
- Dramatically improving load times by removing as much mapping code as possible from the data load process.
- Automating the process of loading ledgers in Microsoft Excel, saving hours of manual work for high-level employees.
- Streamlining parallel direct data loads into FCCS and Planning, shortening a task that once took hours down to just minutes.
- Enabling quick changes to Hyperion Planning (e.g. adding new users) by moving to the cloud, instead of requiring a slow, tedious database refresh.
Looking to enact your own Oracle cloud migration? Datavail is an Oracle Platinum Partner with 17 different specializations, and we’ve helped more than 150 clients successfully migrate to the cloud. To learn more about how we helped this client, check out our case study Industrial Safety Company Stops Outages by Moving to Oracle EPM Cloud. You can also learn more about the impending Hyperion 11.1.2.4 deadline by downloading our white paper It’s the Eleventh Hour for Hyperion 11.1.2.4: Here’s What to Do.
Aligning Your Cloud Adoption Costs with Your Expectations
A big selling point of the cloud for many companies is cost savings. Shifting capital expenses to operational expenses makes it easier to buy in to cloud adoption, but it’s important to have a deep understanding of your costs so your total cost of ownership doesn’t exceed your expectations.
In a recent survey we conducted, we asked companies where they are in their cloud journey. Ten percent of respondents are 100 percent in the cloud, 61 percent use a hybrid cloud infrastructure, 21 percent are currently in the evaluation and planning stage, and 8 percent haven’t started on cloud adoption at all. Each stage of this journey has important costs to consider so that you can better plan for your future moves.
We found that 27 percent of organizations had cloud costs that were higher than they had planned. Fortunately, there are several ways to predict your cloud expenses more accurately and avoid surprises.
Understanding the Shift from CAPEX to OPEX
With cloud-based solutions, you’re fundamentally changing the way you handle the bulk of your technology expenses. The models you use to predict the total cost of ownership for on-premises systems don’t work well for usage-based and subscription-heavy payments. Adjust your accounting to better predict the real-world costs of your cloud technology. It may take several quarters to pin down these numbers, but you’ll be able to build on the data as it comes in.
Consider Your Cloud Implementation and Optimization Costs
Look beyond the base cost of the cloud solution. How much will it cost to fully implement in your organization? You may need to change workflows, increase your network bandwidth, or expand your endpoint security to support mobile devices.
If you use an Infrastructure as a Service solution, you need to optimize it based on your requirements. Depending on the complexity of your project, you could end up paying significant amounts to get the best performance out of your cloud investment.
Keep a Close Eye on Your Cloud Consumption
Monitor your real-world usage and adjust your cost predictions based on this data. Sometimes it’s hard to pin down exactly how many resources you need, especially when you’re working with a usage-based payment model. Many cloud providers have calculators that allow you to get a general idea of your numbers, so you can better align them with your expected costs. Third-party tools are also available for cloud monitoring.
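As one example of programmatic monitoring, the sketch below pulls month-to-date spend by service from AWS Cost Explorer using boto3. It assumes AWS credentials and a default region are configured, that Cost Explorer is enabled on the account, and the date range is only illustrative; other clouds offer comparable billing APIs.

```python
# Illustrative sketch: pull monthly AWS spend by service with Cost Explorer.
# Assumes boto3 is installed, AWS credentials and a default region are
# configured, and Cost Explorer is enabled for the account.
import boto3

ce = boto3.client("ce")

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2021-06-01", "End": "2021-06-30"},  # example range
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

# Print each service's cost for the period so it can be compared
# against the numbers from your planning calculators.
for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = group["Metrics"]["UnblendedCost"]["Amount"]
    print(f"{service}: ${float(amount):,.2f}")
```

Feeding a report like this back into your forecasts each month is a simple way to keep predictions aligned with actual consumption.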
Develop a Scaling Plan
Unlike with on-premises infrastructure, it’s simple to scale cloud workloads up and down as needed. Create a strategy that maximizes your flexibility so you don’t overpay for capacity you’re not using, and don’t be afraid to adjust the plan as you gain more experience with your chosen cloud platforms. Many platforms offer automated scaling features to make this process even easier.
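To make that concrete, here's a hedged sketch of one such automated scaling feature: a target-tracking policy on an ECS service via AWS Application Auto Scaling. The cluster and service names are placeholders, and the thresholds are only examples.

```python
# Illustrative sketch: target-tracking auto scaling for an ECS service
# using AWS Application Auto Scaling. Resource names are placeholders.
import boto3

autoscaling = boto3.client("application-autoscaling")

resource_id = "service/my-cluster/my-service"  # hypothetical cluster/service

# Register the service so it can scale between 2 and 10 tasks.
autoscaling.register_scalable_target(
    ServiceNamespace="ecs",
    ResourceId=resource_id,
    ScalableDimension="ecs:service:DesiredCount",
    MinCapacity=2,
    MaxCapacity=10,
)

# Scale out and in automatically to keep average CPU utilization near 60%.
autoscaling.put_scaling_policy(
    PolicyName="keep-cpu-near-60",
    ServiceNamespace="ecs",
    ResourceId=resource_id,
    ScalableDimension="ecs:service:DesiredCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 60.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ECSServiceAverageCPUUtilization"
        },
    },
)
```

The point of a policy like this is that capacity (and therefore cost) follows demand automatically instead of being provisioned for the peak.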
Use Reserved Instances for Predictable Workloads
If you have workloads that have static requirements or change very slowly, many cloud providers allow you to set up reserved instances. You pay for these instances on a long-term basis, such as a year upfront, and get a substantially decreased cost.
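The savings math is straightforward. The sketch below compares a year of on-demand pricing against a one-year reservation using purely hypothetical hourly rates, just to show the shape of the calculation; substitute the figures from your provider's pricing page.

```python
# Back-of-the-envelope comparison of on-demand vs. reserved pricing.
# The hourly rates below are hypothetical placeholders, not real quotes.
HOURS_PER_YEAR = 24 * 365

on_demand_rate = 0.20           # $/hour, hypothetical
reserved_effective_rate = 0.13  # $/hour equivalent for a 1-year commitment, hypothetical

on_demand_cost = on_demand_rate * HOURS_PER_YEAR
reserved_cost = reserved_effective_rate * HOURS_PER_YEAR
savings_pct = (1 - reserved_cost / on_demand_cost) * 100

print(f"On-demand for a year: ${on_demand_cost:,.2f}")
print(f"Reserved for a year:  ${reserved_cost:,.2f}")
print(f"Savings:              {savings_pct:.0f}%")
```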
Work with an Experienced Cloud Migration Partner
One way to get better insights into the cost of your cloud migration is to work with an experienced partner. At Datavail, we’ve guided hundreds of organizations through cloud migrations and modernization. Our experience leads to cost savings throughout the entire process, allowing you to deploy cloud-based solutions faster, optimize your cloud infrastructure, and plan around well-informed expense predictions.
We’re an AWS Advanced Consulting Tier Partner, an Oracle Platinum Partner, a MongoDB Premier Partner, and a Microsoft Gold Partner with 15 years of experience and over 800 data architects and DBAs. No matter what type of cloud technology you’re migrating to, we’re able to help. Learn more about cloud adoption trends in our white paper. Contact us to get started on your cloud journey.
Read This Next
Cloud Adoption Industry Benchmark: Trends & Best Practices
Datavail partnered with TechValidate, an independent third-party surveyor, to conduct a cloud adoption industry benchmark survey. This paper takes a look at the results along with a big picture view on cloud history and trends.
How Oracle EPM Cloud Can Improve Productivity in Your Financial Processes
Are you bogged down by slow, inaccurate financial close and consolidation processes? So was one of our recent clients—before Datavail stepped in to help them, that is.
Datavail’s client is a Fortune 500 company with tens of thousands of employees operating hundreds of gas stations and retail outlets across North America. The client’s primary impetus for working with Datavail was the impending deadline for Oracle Hyperion 11.1.2.4, which will reach its end of support (EOS) date in December 2021. Beyond that date, Oracle customers who keep using 11.1.2.4 will fall out of compliance, expose themselves to unpatched security flaws, and miss out on the latest features and functionality.
The Migration Process
The client decided to migrate to Oracle EPM Cloud instead of remaining on-premises. Datavail helped the client at every step of the process, first gathering requirements and building a project roadmap and timeline. Next, Datavail built and converted the client’s metadata, including their alternate hierarchies.
One roadblock: the client’s data location hierarchy was highly complex; each location needed monthly, quarterly, and annual reiteration, and a single location might be subdivided into multiple stores. To help with this issue, Datavail leveraged the Oracle Smart View feature, which is only available in the Oracle cloud, to build hierarchies more rapidly. In a few weeks, Datavail had already finished most of the project and delivered a working system in the cloud, with only data validation left to complete.
Datavail helped the client identify several inaccuracies and errors in their datasets and reports, implementing some behind-the-scenes changes to dramatically speed up the reporting process. Although the size of the client’s enterprise data was one of the largest Datavail had ever worked with (e.g. 5,000 rows of data across 50 columns simultaneously), the system maintained high performance throughout, with marked improvements upon completion of the upgrade. For example, a consolidation process that took 10 minutes on-premises now takes just 2 minutes in the cloud.
The Benefits
If you’re considering a large EPM cloud migration of your own or need help with your Oracle EPM applications, you’re in expert hands with Datavail. Thanks to the partnership with Datavail, the client gained the following time-saving benefits:
- A fivefold speedup for their financial consolidation processes that will free up hours of time for employees in the long run, boosting productivity and efficiency.
- Dramatic changes and improvements to reporting and analytics workflows, including the correction of many inefficiencies and errors.
- A hybrid onshore/offshore model that could scale to fit the client’s needs, even as the scope of the project expanded.
These gains established a solid foundation for future enhancements and additional Oracle cloud migrations.
Want the full story? Download the complete case study Major Retailer Improves Productivity by Moving to Oracle EPM Cloud.
Read This Next
It’s the Eleventh Hour for Hyperion 11.1.2.4 — Here’s What to Do
With all the excitement around the long-awaited new release came the beginning of the end for Hyperion 11.1.2.4, the previous version of the software, which will reach its end of support date in December 2021. So what should you do? In this white paper, we discuss how you can take action to protect your business.
The Types of Databases Powering the Cloud
We recently conducted a survey on cloud adoption, and one of the questions we touched upon was the type of databases powering the cloud. Our respondents leverage a wide range of database technologies for their cloud approaches. Here are the top selections, presented in order of popularity.
1. Microsoft SQL Server
Microsoft SQL Server was by far the most popular database selection, with 140 respondents. It’s a strong general-purpose relational database that is widely supported across many cloud platforms, and you can deploy it on Windows servers, Linux servers, and containers. One of its biggest advantages is the ability to query other databases’ data in place. SQL Server 2019 also added Spark and HDFS support out of the box, so you can work with both structured and unstructured data and use your programming language of choice.
2. Oracle
More than 80 respondents use Oracle to power their cloud adoption. This widely used technology is a multi-model database management system, and Oracle’s broader portfolio also includes MySQL, NoSQL, and in-memory database offerings. Oracle provides many implementation options as well as deep integration with its other solutions. It’s a powerful platform with strong reliability and commercial support, making it popular among larger organizations and those with particularly demanding workloads.
3. MySQL
MySQL is a general-purpose open-source database known for its low total cost of ownership, user-friendliness, and support for scaling OLTP applications. Over 40 respondents use this database in their cloud adoption strategy. Replication features offer high performance and reliability, while the InnoDB storage engine brings ACID compliance to the table.
4. PostgreSQL
PostgreSQL is another open-source relational database high on the list, with over 20 respondents. This database has been around for more than 30 years, is ACID compliant, and is known for being extremely reliable. A major advantage of the platform is its flexibility: you can add custom data types, develop custom functionality, and integrate add-ons from an active developer community, all for free.
5. IBM Db2
IBM Db2 is the choice for 20 respondents. It’s a relational database that leverages artificial intelligence for modern applications, supports multi-cloud and on-premises deployments, and handles both structured and unstructured data. This enterprise-grade database is commonly used in IBM host environments.
6. MongoDB
MongoDB is one of the most commonly used document stores, designed for general purpose use. Organizations of all sizes leverage this platform, and the features support many modern applications. Transactional, operational, and analytical applications are all supported in a single database, and it has significant support among third-party developers.
7. MariaDB
MariaDB is an open-source relational database that is compatible with MySQL and Oracle, offers a column-oriented storage engine, and has JSON support. You can put transactional, analytical, and hybrid workloads on the same database technology, using row or column storage as needed for each use case. Deployment options include running it as a standalone relational database, setting it up as a distributed SQL database, or powering a data warehouse with it, and you can plug in different storage engines to optimize each workload.
8. Cassandra
Cassandra is a wide-column store, NoSQL database. It’s designed to support multi-cloud and hybrid cloud environments, with reliable performance, high scalability, and features that power modern applications. Operating this database is intentionally kept simple so the total cost of ownership stays low.
Moving to a Modern Database
At Datavail, we’ve guided hundreds of customers through database modernization and cloud migration, and we have extensive expertise with all of the databases mentioned above. We’re certified partners with many database and cloud platforms, including Oracle, MongoDB, AWS, and Microsoft.
We can help you bring your databases up to speed with end-to-end service. Learn more about cloud adoption trends by reading our white paper.