
Why Access to Software Portals Is More Important Than Ever


The COVID-19 pandemic has upended the way businesses of all sizes and industries operate—in particular, the places that employees do their work. According to an October 2020 Gallup poll, 33 percent of U.S. employees say that they are “always” working remotely, while another 25 percent say that they “sometimes” telecommute.


Faced with this rapid and unexpected shift to working from home, organizations have had to make sudden changes, especially to their IT systems and software. Pandemic or not, businesses must ensure that their employees, customers, partners, and third-party vendors enjoy continued access to key software applications and services. These concerns are especially relevant for companies whose locations and employees are scattered across the country or around the world, or that must navigate different time zones, regulatory environments, and more.

Although the pandemic is waning, its effects and repercussions will be long lasting. Businesses that take advantage of this time of change, uncertainty, and turbulence will position themselves well for whatever comes next.

For some companies that have not adopted a resilient mindset and a transformation-focused organizational culture, the pandemic has been a challenge to their business—often an existential one. Others, however, have seen the COVID-19 pandemic as an affirmation of the investments they previously made in their IT infrastructure. Below are just some of the digital technologies that have paid dividends for their users:

  • Cloud computing for easier access to applications and services and rapid horizontal and vertical scalability.
  • Identity management (IdM) solutions to monitor and control employees’ access to business-critical applications.
  • Automation of tedious manual processes, freeing up employees for higher-level, revenue-generating activities.
  • Data analytics to collect, process, analyze and visualize vast quantities of information, mining it for insights to enable more accurate forecasts and smarter decision-making.
  • Workplace communication and collaboration tools (e.g., Microsoft Teams, Slack, Trello, Zoom, etc.).


The rise in telecommuting has fostered the growth of SaaS (“software as a service”) usage. Software applications running in the cloud have greatly expanded user connectivity and productivity. Cloud-native software services have many advantages, with a few of them especially relevant in this day and age:

  • Support and maintenance for SaaS applications is the responsibility of the software vendor, rather than the in-house IT department. This means less stress on overburdened IT teams, and not waking up at 3 a.m. when a server goes down.
  • As a corollary, SaaS upgrades are rolled out smoothly and automatically, without the need for business-disrupting downtime.
  • Users can access SaaS applications from anywhere with an Internet connection and at any time—a vital asset during this time of telecommuting, but also tremendously convenient in general.
  • During times of uncertain demand, SaaS applications can easily scale to accommodate spikes in usage without degrading performance.
  • By using a centralized, streamlined solution rather than disconnected legacy systems, SaaS improves visibility and makes IT governance easier.


In particular, SaaSOps (“SaaS operations”), i.e., managing and monitoring an organization’s use of SaaS applications, is becoming increasingly relevant and important. According to BetterCloud’s “2020 State of SaaS” report:

  • Organizations use an average of 80 SaaS applications.
  • IT teams spend an average of more than seven hours offboarding an employee from the company’s SaaS applications after they depart the organization.
  • Only 49 percent of IT professionals are confident in their ability to detect unauthorized SaaS usage on the company network.


To confront these SaaSOps challenges, organizations need a centralized, coordinated approach. One of the easiest IT projects for your business to take on—yet one of the most impactful for employee productivity and user experience—is building a clean, streamlined application portal with single sign-on (SSO), simplifying the process of logging in to and using enterprise software.

With a single application portal, users can enter their credentials and access the services they need to do their jobs efficiently, from anywhere and at any time.

Looking to implement your own software application portal? Datavail can help. To learn how we helped one client implement a secure application portal in the Microsoft Azure cloud, check out our recent case study “Major Auto Manufacturer Migrates Application Portal to Azure Cloud.”

The post Why Access to Software Portals Is More Important Than Ever appeared first on Datavail.

The Importance of Database Modernization for Cloud Adoption

Getting the most out of cloud technology involves far more than simply adopting cloud-based infrastructure. Older databases shifted into the cloud may not serve the needs of modern applications. As your users have more complex use cases, with a growing amount of data and data sources, real-time feature requests, and rapid scaling requirements, your database technology needs to change too.


Database modernization happens in multiple stages, as your organization goes through testing, evaluation, and pilot projects to determine which areas benefit most from upgrading the database.

In our recent cloud adoption survey:

  • 46 percent of respondents planned on using or are considering modern data platforms.
  • 34 percent are remaining on their current solution.
  • 22 percent are considering a move in the future.
  • 19 percent are evaluating and planning their migration.
  • 15 percent are in the process of migration.
  • 10 percent are building new applications on new platforms while keeping legacy applications on old platforms.

As you can see, the way that organizations go about database modernization comes in many forms.

Benefits of Database Modernization

Gaining Purpose-built Databases
Databases are not a one-size-fits-all technology, but some organizations opt for the same technology no matter what application it’s for. By matching purpose-built databases to specific use cases, you can improve performance, expand functionality, and get the data structures that make sense for the project.

Reducing Costs
Modernizing your databases can also lead to lower expenses. Since they’re fine-tuned for a specific purpose, you get access to the features you need without paying for those that are not useful. These database platforms often require less time spent on maintenance, security, and optimization, so your database administrators and system administrators have more time in their workdays.

Better Reliability
Modern database platforms are filled with features that help your systems stay online and meet SLAs, including high availability, distributed processing, and robust disaster recovery options. If you’re using cloud-native database solutions, you also have the advantage of using technology specifically designed to get the most out of the cloud.

Fast Provisioning
Modern database platforms drastically reduce the amount of time it takes to spin up a new database instance and often offer a streamlined process. For some platforms, all you need to do is click a button. Scaling is also simple, with many solutions offering automated control over the resources you’re using.

Signs That You Should Modernize Your Databases

  • Difficulty in keeping up with growing usage: Your users and workloads are rapidly increasing, and the system is starting to strain under the pressure. Performance issues abound and make it difficult to achieve peak productivity.
  • Inability to work with new data sources and structures: As new data sources and structures develop, older databases may not support these formats. You could lose out on valuable insights or end up with a major opportunity cost in the long run.
  • Increased demands on the IT team to keep the system running: Frequent downtime, crashes, errors, and other issues add up fast with older databases. You also have to worry about security exploits and other vulnerabilities occurring with databases that are past their prime or end of life.
  • Struggles meeting SLAs: You fail to meet your SLAs due to issues with the system, whether it becomes inaccessible or has extremely slow performance.
  • Database costs rising uncontrollably: Propping up older technology can become expensive in many ways, from the resources required to keep it operational to sourcing specialists of less popular databases.

Moving to Modern Databases in the Cloud with Datavail

As a leading cloud partner with AWS, Microsoft, Oracle, and MongoDB, we can help with your database modernization and cloud migration. We have more than 15 years of experience and over 800 data architects and database administrators ready to move your applications to cutting-edge databases. Learn more about the cloud adoption journey and see the rest of the survey results in our latest paper. Contact us to get started.

Read This Next

Modernize Legacy Tech with MongoDB

Your organization is probably running technology that is past its prime, and you probably know you need to update and upgrade it all to maintain your corporate competitiveness. In short, you need to ‘modernize,’ and MongoDB provides you with the tools you’ll need to bring all your tech – software, apps, and systems – up to speed.


Aligning Your Cloud Adoption Costs with Your Expectations

A big selling point of the cloud for many companies is cost savings. Shifting capital expenses to operational expenses makes it easier to buy in to cloud adoption, but it’s important to have a deep understanding of your costs so that your total cost of ownership doesn’t exceed your expectations.

In a recent survey we conducted, we asked companies where they are in their cloud journey. Ten percent of respondents are 100 percent in the cloud, 61 percent use a hybrid cloud infrastructure, 21 percent are currently in the evaluation and planning stage, and 8 percent haven’t started on cloud adoption at all. Each stage of this journey has important costs to consider so that you can better plan for your future moves.

We found that 27 percent of organizations had cloud costs that were higher than they planned. You have several ways that you can better predict your cloud expenses to avoid surprises.

Understanding the Shift from CAPEX to OPEX

You’re fundamentally changing the way that you handle the bulk of your technology expenses with cloud-based solutions. The models you use to predict the total cost of ownership for on-premise systems don’t work with usage and subscription-heavy payments. Adjust your accounting to better predict the real-world costs of your cloud technology. It may take several quarters to pin down these numbers, but you’ll be able to build on the data as it comes in.

Consider Your Cloud Implementation and Optimization Costs

Look beyond the base cost of the cloud solution. How much will it cost to fully implement in your organization? You may need to change workflows, increase your network bandwidth, or expand your endpoint security to support mobile devices.

If you use an Infrastructure as a Service solution, you need to optimize it based on your requirements. Depending on the complexity of your project, you could end up paying significant amounts to get the best performance out of your cloud investment.

Keep a Close Eye on Your Cloud Consumption

Monitor your real-world usage and adjust your cost predictions based on this data. Sometimes it’s hard to pin down exactly how many resources you need, especially when you’re working with a usage-based payment model. Many cloud providers have calculators that allow you to get a general idea of your numbers, so you can better align them with your expected costs. Third-party tools are also available for cloud monitoring.

Develop a Scaling Plan

Unlike with on-premise infrastructure, it’s simple to scale cloud workloads up and down as needed. Create a strategy that maximizes your flexibility so you don’t overpay for capacity you’re not using. Don’t be afraid to adjust this plan as you gain more experience with your selected cloud platforms. Many systems offer automated scaling features to make this process even easier.

Use Reserved Instances for Predictable Workloads

If you have workloads that have static requirements or change very slowly, many cloud providers allow you to set up reserved instances. You pay for these instances on a long-term basis, such as a year upfront, and get a substantially decreased cost.

Work with an Experienced Cloud Migration Partner

One way to get better insights into the cost of your cloud migration is to work with an experienced partner. At Datavail, we’ve guided hundreds of organizations through cloud migrations and modernization. Our experience leads to cost savings throughout the entire process, allowing you to deploy cloud-based solutions faster, optimize your cloud infrastructure, and plan around well-informed expense predictions.

We’re an AWS Advanced Consulting Tier Partner, an Oracle Platinum Partner, a MongoDB Premier Partner, and a Microsoft Gold Partner with 15 years of experience and over 800 data architects and DBAs. No matter what type of cloud technology you’re migrating to, we’re able to help. Learn more about cloud adoption trends in our white paper. Contact us to get started on your cloud journey.

Read This Next

Cloud Adoption Industry Benchmark: Trends & Best Practices

Datavail partnered with TechValidate, an independent third-party surveyor, to conduct a cloud adoption industry benchmark survey. This paper takes a look at the results along with a big picture view on cloud history and trends.


The Types of Databases Powering the Cloud

We recently conducted a survey on cloud adoption, and one of the questions we touched upon was the type of databases powering the cloud. Our respondents leverage a wide range of database technologies for their cloud approaches. Here are the top selections, presented in order of popularity.

1. Microsoft SQL Server

Microsoft SQL Server was by far the most popular database selection, with 140 respondents. It is a strong general-purpose relational database that is widely supported across many cloud platforms. You can deploy it on Windows and Linux servers, as well as in containers. One of its biggest advantages is the ability to query other databases’ data in place. SQL Server 2019 also added Spark and HDFS support out of the box. You can work with both structured and unstructured data and use your programming language of choice.

2. Oracle

More than 80 respondents use Oracle to power their cloud adoption. This widely used database technology offers a multi-model database management system. It also supports MySQL, NoSQL, and in-memory databases. Oracle offers many types of implementation, as well as deep integration with their other solutions. It’s powerful with significant reliability and commercial support, making it popular among larger organizations and those with particularly demanding workloads.

3. MySQL

MySQL is a general-purpose open-source database known for its low total cost of ownership, user-friendliness, and support for scaling OLTP applications. Over 40 respondents use this database in their cloud adoption strategy. Replication features offer high performance and reliability, while InnoDB integration brings ACID compliance to the table.

4. PostgreSQL

PostgreSQL is another open-source relational database finding itself high on the list, with over 20 respondents. This database has been around for more than 30 years, is ACID compliant, and is known for being extremely reliable. A major advantage of this platform is that it offers a lot of flexibility. You can easily add custom data types, develop custom functionality, integrate add-ons from the active developer community, and it’s all available for free.

5. IBM Db2

IBM Db2 is the choice for 20 respondents. It’s a relational database that leverages artificial intelligence for modern applications. It supports multi-cloud and on-premise deployments, and offers both structured and unstructured data storage. This enterprise-grade database is commonly used in IBM host environments.

6. MongoDB

MongoDB is one of the most commonly used document stores, designed for general purpose use. Organizations of all sizes leverage this platform, and the features support many modern applications. Transactional, operational, and analytical applications are all supported in a single database, and it has significant support among third-party developers.

7. MariaDB

MariaDB is an open-source relational database that is compatible with MySQL and Oracle, offers a column-oriented storage engine, and has JSON support. You can put your transactional, analytical, and hybrid workloads on the same database technology, and use row and column storage as needed for each use case. Deployment options include using it as a relational database, setting it up as a distributed SQL database, or powering a data warehouse with it. You can plug in different storage engines to optimize each workload.

8. Cassandra

Cassandra is a wide-column store, NoSQL database. It’s designed to support multi-cloud and hybrid cloud environments, with reliable performance, high scalability, and features that power modern applications. Operating this database is intentionally kept simple so the total cost of ownership stays low.

Moving to a Modern Database

At Datavail, we’ve guided hundreds of customers through database modernization and cloud migration and have extensive expertise with all of the databases mentioned above. We’re partners and certified with many database platforms, including Oracle, MongoDB, AWS, and Microsoft.

We can help you bring your databases up to speed with end-to-end service. Learn more about cloud adoption trends by reading our white paper.


Timestamp Functions and Presentation Variables in Oracle Analytics Cloud

One of the most popular Rittman Mead blog posts over the last 10 years is Timestamps and Presentation Variables. As we are seeing more and more migrations to OAC, we decided to review and revise this post for the latest version of Oracle Analytics Cloud (OAC), 105.4.0-140 as of October 2019. Read more about the latest updates here.


One could say that creating a chart is not the most complex task in the world of Business Intelligence, but we would argue that creating a meaningful report, one that perfectly illustrates the message hidden in the data and therefore adds value for management, is nowhere close to easy! A good way to make a report as informative as possible is to use trends and comparisons, and time analysis functions are the perfect tool for this: for example, comparing sales in a period of time this year to the same period the year before, or measuring the similarity or dissimilarity of sales in different months of the year.

Demo Platform

I have used a free trial instance of OAC for this demo. If you haven’t done so yet, sign up for a free 30-day trial Oracle Cloud account (different from an Oracle account). Use the account to access the Oracle Cloud Infrastructure (OCI) console, Oracle’s latest move toward one integrated cloud platform to manage all your Oracle cloud applications, platforms, and infrastructure in one place.
From the OCI console it takes 5 to 10 minutes to get your free trial instance of OAC up and running. For a detailed step-by-step guide to creating a new instance, read here.

Demo Goals

In this blog post I intend to show you how to combine the power of timestamp functions and presentation variables to create robust, repeatable reports. We will create a report that displays a year over year analysis for any rolling number of periods, by week or month, from any date in time, all determined by the user. This entire demo will only use values from a date and a revenue field.


TIMESTAMPADD() manipulates data of the DATE and DATETIME data types based on a calendar year.

Syntax: TIMESTAMPADD(interval, expr, timestamp)
Example: TIMESTAMPADD(SQL_TSI_MONTH, 12, Time."Order Date")
Description: Adds a specified number of intervals to a timestamp, and returns a single timestamp.

Read more about other calendar functions.

Building Filters

To start building our demo, the filter below returns all dates greater than or equal to 7 days ago, including the current date.
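The original post showed this filter as a screenshot. Using the "Periods"."Day Date" column referenced later in this post, a sketch of that filter expression is:

```sql
"Periods"."Day Date" >= TIMESTAMPADD(SQL_TSI_DAY, -7, CURRENT_DATE)
```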

In other words, we now have a functional filter to select all the rows where Date >= a week ago.

As a good practice, always include a second filter giving an upper limit to the time filter. For example "Periods"."Day Date" < CURRENT_DATE would confirm that there won’t be any records that you don’t want in the mix and therefore no unnecessary strain on the system.

Let’s go one step further: instead of going 7 days back, we could include all the previous days in the current month, in other words dates >= the first day of the month. In this scenario, we can use the DAYOFMONTH() function to get the calendar day of any date. From there it is easy to calculate the number of days in the month so far. Our new filter would look like this:
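The screenshot is not reproduced here; reconstructed from the description that follows (subtract the day-of-month, then add one day), the filter expression is roughly:

```sql
"Periods"."Day Date" >= TIMESTAMPADD(SQL_TSI_DAY, -DAYOFMONTH(CURRENT_DATE) + 1, CURRENT_DATE)
```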

For example, if today is October 16th, DAYOFMONTH(CURRENT_DATE) would equal 16. Subtracting 16 days from CURRENT_DATE takes us back to September 30th, and adding one day gives us October 1st.

Presentation Variables

A presentation variable is a variable that can be created from the front end of Analytics as part of one of the following types of dashboard prompts:

  • Column prompt: associated with a column, and the values it can take come from the column values. For information on working with column prompts, see Creating a Column Prompt.
  • Variable prompt: not associated with any column; you define the values that it can take. For information on working with variable prompts, see Creating a Variable Prompt.

Each time a user selects a value in the column or variable prompt, the presentation variable is set to that value, which is then sent to any references to that variable throughout the dashboard page, whether in filters, formulas, or even text boxes.

The first presentation variable we could introduce replaces CURRENT_DATE with a prompted value. Let’s call this presentation variable pv_Date.

  • Use the syntax @{pv_Date} to call this variable in the reports.
  • For variables of type string, surround the name in single quotes: ‘@{pv_String}’
  • It is good practice to assign a default value to the presentation variables so that you can work with your report before publishing it to a dashboard. For example the default value for the pv_Date is CURRENT_DATE so the new syntax would be @{pv_Date}{CURRENT_DATE}

Demo Time!

Our updated filter after replacing CURRENT_DATE looks like the one below. We will refer to this filter later as Filter 1 (F1).
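The screenshot is not reproduced here; one plausible reconstruction simply swaps CURRENT_DATE for the prompted value with its default:

```sql
"Periods"."Day Date" >= TIMESTAMPADD(SQL_TSI_DAY, -DAYOFMONTH(@{pv_Date}{CURRENT_DATE}) + 1, @{pv_Date}{CURRENT_DATE})
```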

The filter is starting to take shape. Now let’s say we always want to look at a date range of six months before the selected date. All we need to do is create a nested TIMESTAMPADD function. To do this, we will “wrap” our current TIMESTAMPADD with another one that subtracts six months:
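The original screenshot is omitted; as a sketch, the wrapped expression might look like:

```sql
"Periods"."Day Date" >= TIMESTAMPADD(SQL_TSI_MONTH, -6,
    TIMESTAMPADD(SQL_TSI_DAY, -DAYOFMONTH(@{pv_Date}{CURRENT_DATE}) + 1, @{pv_Date}{CURRENT_DATE}))
```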

Now we have a filter to select dates that are greater than or equal to the first day of the month of any given date and all the six months prior to that.

To take this one step further, we can create another presentation variable called pv_n to allow users to determine the number of months to include in this analysis from a dashboard prompt.

Here is the updated version of our filter using the number of periods presentation variable and a default value of 6, @{pv_n}{6}. We will refer to the following filter as Filter 2 (F2).
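Sketched out (the screenshot is not reproduced here), Filter 2 replaces the hard-coded six months with the new variable and its default:

```sql
"Periods"."Day Date" >= TIMESTAMPADD(SQL_TSI_MONTH, -@{pv_n}{6},
    TIMESTAMPADD(SQL_TSI_DAY, -DAYOFMONTH(@{pv_Date}{CURRENT_DATE}) + 1, @{pv_Date}{CURRENT_DATE}))
```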

Our TIMESTAMPADD function is now fairly robust and will give us any date greater than or equal to the first day of the month from n months ago from any given date. Now we will see what we just created in action by creating date ranges to allow for a Year over Year analysis for any number of months. Consider the following filter set:
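The screenshot of the filter set is not reproduced here. A reconstruction consistent with the description below pairs the n-month lower bound with an upper bound at the first of the month (which is what excludes current-month dates), OR’d with the same pair shifted back one year:

```sql
(    "Periods"."Day Date" >= TIMESTAMPADD(SQL_TSI_MONTH, -@{pv_n}{6},
         TIMESTAMPADD(SQL_TSI_DAY, -DAYOFMONTH(@{pv_Date}{CURRENT_DATE}) + 1, @{pv_Date}{CURRENT_DATE}))
 AND "Periods"."Day Date" <  TIMESTAMPADD(SQL_TSI_DAY, -DAYOFMONTH(@{pv_Date}{CURRENT_DATE}) + 1, @{pv_Date}{CURRENT_DATE}))
OR
(    "Periods"."Day Date" >= TIMESTAMPADD(SQL_TSI_YEAR, -1, TIMESTAMPADD(SQL_TSI_MONTH, -@{pv_n}{6},
         TIMESTAMPADD(SQL_TSI_DAY, -DAYOFMONTH(@{pv_Date}{CURRENT_DATE}) + 1, @{pv_Date}{CURRENT_DATE})))
 AND "Periods"."Day Date" <  TIMESTAMPADD(SQL_TSI_YEAR, -1,
         TIMESTAMPADD(SQL_TSI_DAY, -DAYOFMONTH(@{pv_Date}{CURRENT_DATE}) + 1, @{pv_Date}{CURRENT_DATE})))
```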

This may appear pretty intimidating at first, but if we break it into parts we can start to understand its purpose. Notice we are using the exact same filters from before: Filter 1 and Filter 2. What we have done here is filter on two time periods, separated by the OR statement.

  • The first date range defines the period as being the most recent completed n months from any given prompted date value, using a presentation variable with a default of today. Dates in the current month have been removed from the set by Filter 1.
  • The second time period, after the OR statement, is the exact same as the first only it has been wrapped in another TIMESTAMP function subtracting a year, giving you the exact same time frame for the year prior.

This allows us to create a report that can run a year over year analysis for a rolling n month time frame determined by the user.

A note on nested TIMESTAMPADDs: always create them with the smallest interval first, then wrap with larger intervals as necessary. In this case our smallest increment is day, wrapped by month, wrapped by year.

Let’s Go Crazy

A more advanced trick: if you use real-time or near-real-time reporting, using CURRENT_DATE may be how you want to proceed. Otherwise, instead of using today as your default date value, use yesterday’s date, since most data are only as current as yesterday. Using yesterday is especially valuable when pulling reports on the first day of the month or year, when you generally want the entire previous time period rather than the empty beginning of a new one. To implement this, wherever you have @{pv_Date}{CURRENT_DATE}, replace it with @{pv_Date}{TIMESTAMPADD(SQL_TSI_DAY, -1, CURRENT_DATE)}.

One more change to make our filter extra flexible is to use a new presentation variable to determine whether to display year-over-year values by month or by week. This can be done by inserting a variable into your SQL_TSI_MONTH and DAYOFMONTH statements, changing them to SQL_TSI_@{pv_INT}{MONTH} and DAYOF@{pv_INT}{MONTH}, where pv_INT is the name of our variable.

Start by creating a dummy variable in your prompt to allow users to select either MONTH or WEEK. You can try something like this: CASE MOD(DAY("Time"."Date"), 2) WHEN 0 THEN 'WEEK' WHEN 1 THEN 'MONTH' END

The updated filter now looks like this:
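Sketched with the interval variable substituted in (pv_INT defaulting to MONTH, so DAYOF@{pv_INT}{MONTH} expands to DAYOFMONTH or DAYOFWEEK), the lower bound becomes:

```sql
"Periods"."Day Date" >= TIMESTAMPADD(SQL_TSI_@{pv_INT}{MONTH}, -@{pv_n}{6},
    TIMESTAMPADD(SQL_TSI_DAY, -DAYOF@{pv_INT}{MONTH}(@{pv_Date}{CURRENT_DATE}) + 1, @{pv_Date}{CURRENT_DATE}))
```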

In order for our interaction between month and week to run smoothly, we have to factor in one last consideration: if we take the date December 1st, 2019 and subtract one year, we get December 1st, 2018. However, if we take the first day of this week, Sunday December 15th, 2019, and subtract one year, we get Saturday December 15th, 2018. In our analysis this will cause an extra partial week to show up for prior years. To get around this we will add a case statement: if '@{pv_INT}{MONTH}' = 'WEEK' THEN subtract 52 weeks from the first of the week ELSE subtract 1 year from the first of the month. With this, our final filter set will look like this:
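The final screenshot is not reproduced here. As a sketch, the year-offset portion guarded by the case statement might look like the following (shown for the lower bound; the upper bound gets wrapped the same way):

```sql
CASE WHEN '@{pv_INT}{MONTH}' = 'WEEK'
     THEN TIMESTAMPADD(SQL_TSI_WEEK, -52,
              TIMESTAMPADD(SQL_TSI_@{pv_INT}{MONTH}, -@{pv_n}{6},
                  TIMESTAMPADD(SQL_TSI_DAY, -DAYOF@{pv_INT}{MONTH}(@{pv_Date}{CURRENT_DATE}) + 1, @{pv_Date}{CURRENT_DATE})))
     ELSE TIMESTAMPADD(SQL_TSI_YEAR, -1,
              TIMESTAMPADD(SQL_TSI_@{pv_INT}{MONTH}, -@{pv_n}{6},
                  TIMESTAMPADD(SQL_TSI_DAY, -DAYOF@{pv_INT}{MONTH}(@{pv_Date}{CURRENT_DATE}) + 1, @{pv_Date}{CURRENT_DATE})))
END
```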

With the use of these filters and some creative dashboarding, you can construct a report that easily allows you to view a year over year analysis from any date in time for any number of periods either by month or by week.

Filtered by week intervals:

The formula below gives you the rolling period value to use in the analysis:

In this post, we created a cloud version of the amazing demo previously described by Brian Hall. As demonstrated, timestamp functions and their power have been among the interesting topics in visualisation and reporting for as long as we at Rittman Mead can remember, and they can still be used in the realm of Oracle Cloud services in much the same way as in the past.

Feel free to get in touch and let us know your views and comments.