Liberate your data

Intelligence is all about knowledge. This website is dedicated to sharing expertise on Oracle BI.

 

7 Signs Your Company Needs Database Consulting Services

Today, most business leaders know that leveraging data drives business results. Maintaining, organizing, and securing data is becoming more difficult, but it is also more important than ever before. The number of data sources is multiplying and there’s a greater need to integrate that data to create a picture of reality that will help grow a business. As a result, many companies are using database consulting services to help them meet this challenge.

The question many business leaders are struggling with is, “Should we outsource our database administration?” If you’re asking yourself that question, consider the warning signs that indicate your company can benefit from database consulting services.

What is Database Consulting?

Database consulting can take many forms. You might hire a database consultant to supplement your existing staff in completing daily maintenance tasks. In some situations, the database consultant will perform high-level tasks that require a database administrator (DBA) with a long history of designing and planning database architecture.

The best firms provide a wide range of database consulting services, depending on your needs. They should be able to provide DBAs with different areas of expertise and provide coverage at different times of day to help you cover a 24x7 operation.

When to Consider Seeking Out Database Consulting Services

  1. Your database to-do list is growing rapidly

    Customer service wants to add a new column to their database, the warehouse wants to update the inventory tracking system, and the senior executives want a new way to gather business intelligence. If you’re buried in new requests and don’t have the staff to satisfy everyone in a reasonable timeframe, you could clearly use additional database support.

    When you acquire database consulting services, you’ll get the resources you need to keep your systems running at peak efficiency, and you’ll also have more bandwidth to address a growing to-do list.

  2. You’re having problems finding qualified staff

    A February 2020 survey found that 69 percent of employers were having problems finding qualified staff. That statistic has tripled over the last 10 years. At the end of 2019, other experts reported that the most difficult people to find are security analysts, data research scientists, and database administrators. There may be more DBA candidates in the job market after the COVID-19 pandemic, but many companies aren’t in a position to increase their staffing.

    And, what will you do when your Mainframe DBA retires? A technology with a long history typically has experienced DBAs managing it who are planning to retire in the near future. If you’re having trouble finding qualified staff, the problem is only going to get worse. Working with a database consulting firm offers you the expertise you need at a cost much lower than bringing on a new hire.

  3. You need to address innovative projects, but you don’t have the expertise in-house

    Many projects that take advantage of leading-edge technology also need highly experienced DBAs. If you don’t have the expertise in-house, you may consider hiring. But, if you don’t have the work to keep a DBA busy on a regular basis, hiring may not be your best alternative.

    This is another perfect time to hire a database consulting firm that can provide you with the specific expertise you need to complete special projects.

  4. Your best DBAs are busy with maintenance tasks

    Some companies run into the problem of having an experienced DBA on staff, but that person ends up spending much of their time putting out fires, deploying changes and doing maintenance tasks. It’s easy to fall into that situation because it’s usually the result of a loop that’s difficult to get out of.

    The loop starts when the number of low-level tasks your DBA must address increases. The database then stops running as effectively because the DBA doesn’t have time for things such as performance tuning, index optimization, and capacity planning. That forces the DBA into more reactive work, which delays the proactive work and, in turn, generates even more maintenance.

    Database consulting firms offer DBAs at various levels of expertise. If you get additional manpower to handle maintenance, you won’t need to pay for a senior DBA, and it will free up your own staff members to do more value-added work. In addition, your entire organization will benefit from systems that run efficiently.

  5. You’re losing experienced DBAs who want more challenge at work

    Most experienced DBAs want to do the type of work that will challenge them and let them demonstrate their value to the organization. If your experienced DBA is stuck doing low-level work, such as reacting to backup failures and other mundane tasks, they will often decide to seek out other opportunities.

    In this situation, you have two problems. First, you need to hire a less-experienced administrator to do maintenance. Second, you need to hire a replacement for the experienced DBA who has left the organization. A database consulting partner can help you solve both of those problems.

  6. You’re afraid you’ll lose valued employees who are tired of putting out fires in the middle of the night

    If you don’t have experienced DBAs to work all shifts, your DBAs are going to be roused in the middle of the night when there’s a problem that your people at work can’t fix. With the shortage of talent in the industry, DBAs often know they can find another job where they won’t get the midnight emergency calls.

    Partnering with a database consulting firm can provide 24x7x365 support for your systems and let your DBAs avoid falling asleep at their desks and looking for other opportunities.

  7. You need to increase productivity

    You’ve made a significant investment in your company’s mission-critical systems. If they aren’t running well, you’re not getting a return on your investment. If your staff doesn’t have the knowledge and skills to manage the network, or if your IT staff is overworked, your systems are vulnerable to inefficiency and unreliability. That’s not only a burden for IT. The entire business suffers.

    Further, cloud technology and the digital workplace are going to provide a wide range of capabilities that will keep your business competitive. As the DBA role expands, DBAs will be the people who know what data you have, and you’ll need them to help you make sense of it and develop the applications you’ll need to utilize it.

    Bringing in the right level of expertise from a database consulting service is the key to making sure that you meet your business growth goals without breaking the budget.

Get Customized Database Support and Consulting Services

Datavail can offer you database consulting services that we customize for your specific needs. We offer flexibility in terms of the level of expertise you need, and the service delivery alternative that solves your problems.

You may need a DBA, architect, database developer, or someone to take over your low-level database tasks. We can provide consultants with various levels of expertise for a short-term project, a project of undefined duration, or consultants who can integrate into your staff for a long-term project. In addition, our consultants can work onsite or offsite, and are located onshore and offshore. Offshore consultants are particularly beneficial when you need to staff second or third shifts.

For more information, visit our Database Consulting solutions page.


Using the Go-URL in OAS

The OBIEE Go-URL functionality is widely used, and its deprecation in OAS is causing issues for companies wishing to upgrade to OAS.

This is a feature of OBIEE that allows a report to be accessed directly from a URL, enabling links from other applications - often with optional parameters passed through to filter the report - and usually bypassing the login screen by including user credentials within the URL parameters.  

Typically, a single user-id is used for Go-URL access to OBIEE reports, thus avoiding the need either to enter/detect user details or to provide login credentials for users who would not log in to OBIEE directly.
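For illustration, a classic OBIEE Go-URL of this kind might look like the following - the server name, port, report path and credentials here are purely examples, not values from this article:

http://obiee-server:9704/analytics/saw.dll?Go&Path=%2Fshared%2FSales%2FPipeline&NQUser=gourl_user&NQPassword=Welcome123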

With the first release of OAS, Oracle has announced that the Go URL functionality is being deprecated and may be removed in a future release. Also, since OBIEE 12.2.1.3, the advent of the combined BI security login screen for both Analytics and DV has meant the NQUser and NQPassword parameters no longer work, so it is not possible to specify a user for the Go URL and hence it cannot log in.

There are currently two scenarios under which the Go-URL does work in OAS:

  1. If full Single Sign-on to OAS has been implemented - via Oracle Access Manager or Apache, currently the only recognised methods for SSO in OAS - then the Go-URL will work under the user-id that SSO signs in under. In other words, the user credential parameters in the URL are ignored, the user logs in automatically under their credentials and the requested report displays as normal.

    However, this is often not appropriate - for instance if users are clicking on Go-URL links outside the scope of the SSO environment, such as connecting via the internet or from other networks within the company LAN where SSO cannot operate. Also, if reports are designed to work under a specific user, perhaps with special security rules, a user's own security levels may result in different data appearing in a report.

  2. Disable the lightweight SSO option in OAS. This separates the security for Analytics and DV. For Analytics this restores the classic login screen and the NQUser and NQPassword parameters on the Go-URL work correctly, but it causes issues for DV, making it virtually unusable. This is only really an option if you do not want to use DV in your implementation of OAS.

There is however a third option we have discovered which uses the Single Sign-On approach, but without the need to actually detect user credentials via an SSO service. Instead a specific user-id can be provided, just like the NQUser parameter in the Go URL.

Instead of the Go URL passing the NQUser and NQPassword parameters, Apache can be used to redirect access to OAS, providing the user-id to log in with during the redirect. The effect is that the Go URL will still display the same report output whilst logging in with the same user-id, albeit supplied by Apache rather than the NQUser parameter.

This works by using the Single-Sign-On configuration between Apache and OAS.

In a normal SSO configuration, Apache would authenticate users via a standard SSO service, for example Kerberos, and once the user has passed authentication, redirect the URL to OAS, providing the user credentials as request headers so that OAS can automatically log the user in, bypassing the login screen.

The Oracle document ID 2707401.1 explains this process in detail.

We can however ignore the SSO authentication within Apache and instead hard-code the user-id we want to log in to OAS with. Apache will still send this user-id in the request header to OAS, which in turn will recognise it as a pre-authenticated user and log the session in. In the case of a Go URL, the rest of the URL, which points to a report or dashboard, will continue to process as normal, displaying it to the user.

The user is totally unaware of the difference - they click on a link and see a report in OAS just as they did in OBIEE.

Just as with the Go URL, there are security implications of this. The user-id and password are no longer sent in plain text in the URL, which is an improvement, but calling the URL will automatically log you into OAS. If you just use the core URL of http://oas-server:port/analytics, you will be logged in to OAS and end up at the home screen, without any user/password challenge.

It is important therefore to restrict the permissions and privileges of this hard-coded user so it can only access the reports/data it needs to and cannot create new content, access other subject areas etc.  Using the Apache configuration, OAS can be tied down so that just the /analytics URL can be opened for this, so no access can be given to /dv, /xmlpserver or any other area of OAS unless explicitly required.

By using Apache to control access to OAS you create a separate port on which to call OAS for the Go-URL. This means the port you installed OAS with (e.g. 9502/9503) will remain the main port for logging into OAS via the login screen - users manually connecting to OAS for Analytics, DV and all the other functionality should continue to use this port. These connections will not go via Apache.

The Go-URL will use the port provided by Apache - in the instructions below I’ve set this up as port 9603 - and Apache will re-direct the user seamlessly to OAS’s main port.   Apache can also be configured to use SSL certificates, and if installed on the same server as OAS, the same certificates as OAS.  In this example I’m not using SSL, but the instructions for doing so are included.

Assuming you already have OAS installed, below are the steps to install and configure Apache to act as the “SSO” front-end for the Go-URL. The instructions below were carried out on an Oracle Enterprise Linux 7.8 server - other versions of Linux, or Windows, will be slightly different.

Start by installing Apache HTTP Server

As the root user, use yum to download and install Apache and the required SSL and SSO plug-ins:

yum -y install httpd mod_ssl mod_auth_kerb mod_auth_token

Then enable and start Apache:

systemctl enable httpd.service
systemctl start httpd.service

Next, enable Apache to connect to the network and allow access on port 9603. In this case I’ve installed policycoreutils-python to get the semanage command, as I have SELinux enabled.

yum -y install policycoreutils-python 
/usr/sbin/setsebool -P httpd_can_network_connect 1 
semanage port -a -t http_port_t -p tcp 9603 
semanage port -l | grep http

The final command above confirms port 9603 is now available.

Next open the firewall port for 9603:

firewall-cmd --zone=public --add-port=9603/tcp --permanent
service firewalld stop
service firewalld start
systemctl restart httpd.service

Now Apache is installed, we can configure it. Edit the file /etc/httpd/conf/httpd.conf

Set the Listen port and, if SSL is required, add the SSL virtual host properties with the correct server, port and SSL certificate file values (I’ve included the SSL virtual host in httpd.conf, but it could reside in its usual place in ssl.conf instead):

Listen 9603 
<VirtualHost oasvm3.local.com:9603> 
ErrorLog logs/ssl_error_log 
TransferLog logs/ssl_access_log 
ServerName oasvm3.local.com 
SSLEngine on 
SSLProxyEngine on 
SSLCertificateFile /u01/oas55/ssl/certificate.crt 
SSLCertificateKeyFile /u01/oas55/ssl/certificate.key 
SSLCertificateChainFile /u01/oas55/ssl/certificate_chain.pem 
SSLCACertificateFile /u01/oas55/ssl/ca_certificate.crt
SSLProxyCACertificateFile /u01/oas55/ssl/ca_certificate.crt 
RequestHeader set WL-Proxy-SSL "true" 
RequestHeader set IS_SSL "ssl" 
RewriteEngine On 
RewriteOptions Inherit 
ProxyPreserveHost On 
</VirtualHost>

Save and close, then restart Apache:

systemctl restart httpd.service

Then try the Apache URL in a browser: http://oasvm3.local.com:9603 (or https if using SSL).

If using SSL, check for the padlock icon in the address bar.

Configure Apache for OAS

Re-open the httpd.conf file and locate the line:

#ServerName www.example.com:80

Below this, add the following lines - these are typical recommended settings for OAS:

###### Various default settings ###### 
# 
# Timeout: The number of seconds before receives and sends time out. 
# 
Timeout 6000 
# 
# KeepAlive: Whether or not to allow persistent connections (more than 
# one request per connection). Set to "Off" to deactivate. 
# 
KeepAlive On 
# 
# MaxKeepAliveRequests: The maximum number of requests to allow 
# during a persistent connection. Set to 0 to allow an unlimited amount. 
# We recommend you leave this number high, for maximum performance. 
# 
MaxKeepAliveRequests 0 
# 
# KeepAliveTimeout: Number of seconds to wait for the next request from
# the same client on the same connection. 
# 
KeepAliveTimeout 60 
#
#####################################

Now add the following lines to the end of the file, altering the OAS URLs appropriately. The ProxyPass URLs must be the original SSL or Non-SSL OAS URLs.

Define GoUser reportuser 

RewriteEngine On
ProxyPreserveHost On

# Protected Resources 
<Location "/analytics"> 
ProxyPass "http://oasvm3.local.com:9502/analytics" 
ProxyPassReverse "/analytics" 
#SSLRequireSSL 
RequestHeader unset Authorization 
RequestHeader set OAM_REMOTE_USER ${GoUser} 
RequestHeader set iv-user ${GoUser} 
RequestHeader set SM_USER ${GoUser} 
</Location> 

ProxyErrorOverride Off 

# Unprotected Resources 
<Location "/bi-security-login"> 
ProxyPass "https//oasvm3.local.com:9502/bi-security-login"
ProxyPassReverse "/bi-security-login" 
</Location> 

The first line defines the user which will log in to OAS; in this case I’ve created a user called ‘reportuser’. This is then substituted into the request headers below.

Ideally this user should exist in the provider used for authentication in Weblogic - so that any security groups can be picked up to control this user's access with OAS.

Note that SSLRequireSSL is commented out; the hash must be removed if SSL is required.

Save and close, then restart Apache:

systemctl restart httpd.service

Configure Weblogic to accept SSO calls from Apache

To enable Weblogic to recognise SSO tokens from Apache, we use the OAM Identity Asserter.

Login to the Weblogic admin console, navigate to Security Realms → myrealm → Providers, then click Lock & Edit.

Click New. Enter a name and choose OAMIdentityAsserter as the type.

Click OK to add the provider.

Click on the new OAM Asserter provider to edit it. Change the Control Flag to REQUIRED & click Save. The remaining default settings are fine.

Return to the Providers screen, click Reorder and move the OAM Asserter so it is second in the list, below the main user directory - in this case PaulsAD is the main Active Directory authenticator.

This next step isn’t strictly necessary - the Weblogic Plug-in is required to allow access to Data Visualizer and if you have no intention of allowing access to DV via Apache, this step can be skipped.

The Weblogic Plug-in needs enabling in three locations.

From the Domain Structure window click on bi

Go to the Web Applications tab

Scroll down to the WebLogic Plugin Enabled option and check it, then click Save.

From the Domain Structure window expand Environment and click on Servers. Click on bi_server1 in the Configuration tab.

Scroll down the General tab, click on the Advanced link, then locate the WebLogic Plug-In Enabled option and set it to ‘yes’. Click Save.

From the Domain Structure window click on Clusters, then click on bi_cluster.

Again, in the General tab click on Advanced, then set the WebLogic Plug-In Enabled option to ‘yes’. Click Save.

All changes are now complete. Click Activate Changes.

A full stop and restart of Weblogic & OAS is required under the oracle user:

/u01/oas55/config/domains/bi/bitools/bin/stop.sh    
/u01/oas55/config/domains/bi/bitools/bin/start.sh

Enable Lightweight SSO

For this method of accessing OAS to work, the internal lightweight SSO must be enabled. It will be enabled by default, but if it has been disabled, for example to make the Go-URL work, then re-enable it:

Stop OAS, open the WLST command line:

cd /u01/oas55/product/oracle_common/common/bin
./wlst.sh

Re-enable SSO (alter the domain path to suit your environment):

wls:/offline> enableBISingleSignOn('/u01/oas55/config/domains/bi','/bi-security-login/logout?redirect=/dv'); 
wls:/offline> exit();

Then restart OAS to reflect the changes.

The Apache configuration is now complete and you should be able to login to OAS on the Apache URL, e.g. http://oasvm3.local.com:9603/analytics

The SSO configuration should automatically log you in under the user defined in the Apache configuration above.

The original OAS URL remains on port 9503 and will take you to the normal login screen.

Note that when you log in via Apache you can’t sign out! You can click the sign-out link, but the SSO process will simply log you back in again rather than displaying the login screen. To log in normally, close the browser and reopen it with the standard OAS URL.

Apache Log files

If you have any issues with the above, there are five log files for Apache you can use to investigate what’s going on.

  • /var/log/audit/audit.log - contains SELinux issues
  • /var/log/messages - contains general activity messages
  • /var/log/httpd/error_log - contains errors generated by Apache, including some SSL and Kerberos messages
  • /var/log/httpd/ssl_error_log - contains SSL errors generated by Apache
  • /var/log/httpd/ssl_access_log - contains messages related to users connecting over SSL

The logging level in the error_log file is governed by the LogLevel setting in /etc/httpd/conf/httpd.conf. This is normally set to ‘warn’, but can be set to ‘debug’ to display detailed messages.

Testing the Go URL

With Apache/SSO now enabled, the Go URL can be used:

http://oasvm3.local.com:9603/analytics/saw.dll?Go&Path=%2Fshared%2FPipeline QuickAnalytics%2FPipeline%2FSales Pipeline-Charts&locale=en-en&lang=en

Note that if the Go URL includes the NQUser and NQPassword parameters, they will be ignored.

Alternatively the Dashboard parameter can be used to take a user fully into OAS to view a dashboard:

http://oasvm3.local.com:9603/analytics/saw.dll?dashboard&PortalPath=%2Fshared%2FPipeline QuickAnalytics%2F_portal%2FSales Effectiveness

An important point here is that the user is now logged into OAS and can access other content. As mentioned earlier, the user used for Go URL access should have their permissions and privileges tightly controlled to limit access to just what is required.

Accessing other areas of OAS

The instructions above only provide access to the core /analytics functionality via the Apache SSO URL. Other areas of OAS, such as /dv or /xmlpserver, are not enabled.

If these are required the location entries must be added to the httpd.conf file and Apache restarted.

The full list of locations can be found in the Oracle doc mentioned earlier, and includes protected and unprotected locations (as some must be visible prior to logging into OAS, e.g. the bi-security-login URL). As an example, here is the location required to enable DV:

<Location "/dv"> 
ProxyPass "http://oasvm3.local.com:9502/dv" 
ProxyPassReverse "/dv" 
#SSLRequireSSL 
RequestHeader unset Authorization 
RequestHeader set OAM_REMOTE_USER ${GoUser} 
RequestHeader set iv-user ${GoUser} 
RequestHeader set SM_USER ${GoUser} 
</Location>

That’s it, you’re now ready to use the Go-URL with the Apache URL and without including the NQUser/NQPassword parameters.

This solution is not ideal and does have security implications you must take care of, although it does avoid the user/password appearing in the URL, which is a definite positive.

One downside of this is that you can’t use multiple user-ids - all the Go URLs will connect with the same user. It is not possible (well, we’ve not found a way yet) to allow different Go URLs to connect with different user-ids, as Apache will only allow one redirect to /analytics on the main listen port.

Enjoy!

Harnessing Oracle’s ADW with a Dedicated DBA

At only two years old, Oracle’s Autonomous Data Warehouse (ADW) continues to reward early adopters with faster deployments for far less money than they were paying before. This family of self-driving, -repairing, and -securing cloud services leverages automation and machine learning to speed processing, eliminate errors, and relieve human effort to focus on more critical tasks.

And far from reducing the need for a Database Administrator (DBA), the ADW provides tools and capacities that elevate the value of a dedicated DBA far beyond their traditional role.

Here are three reasons why Oracle’s ADW and your DBA are (or should be) your most valuable assets.

Comprehensive Connections

The work of an ADW DBA is similar to that of a CEO – they know every aspect of the database and its abilities just as a CEO knows every aspect of the corporation and its strategies. Drawing from this wealth of knowledge, DBAs can add color and depth to every organizational discussion, from security to performance to migrations and more.

Creativity

Because so many traditional DBA functions are now automated, the DBA’s full skill base is released for higher, more valuable purposes.

  • They can collaborate with team members to get relevant data where it needs to be when it needs to be there. Traditional database management systems often required a unique team simply to build and maintain the complex information systems demanded by enterprise computing processes. With most of those functions automated, the DBA can help developers find and integrate the specific data materials needed for specific applications, and then structure their provisioning appropriately.
  • Using their over-arching comprehension of the range and depth of corporate data stores, today’s DBA can pull relevant information to facilitate data modeling options that solve problems based on actual enterprise information. Insights will not only help leadership make better decisions, but also see new configurations and opportunities that had previously been buried in company data coffers.
  • Their ability to analyze complex corporate concerns draws on a full understanding of the organization’s foundational information infrastructure. Compiling data from across the ADW and using cutting edge analytics tools, the DBA helps to reveal undiscovered patterns in the data that suggest possible trends and potential responses.

Confidence

With its forward-thinking, all-inclusive automation, Oracle’s ADW offers its users confidence on several levels:

  • That their database will be optimized at – literally – all times. Patching gaffs and integrating new programming using state-of-the-art artificial intelligence and machine learning ensures both legacy and new features will deploy and function correctly and on time, every time.
  • That their critical corporate information is safe and secure at all times. In many cases, human error accounts for data security failures; the ADW eliminates the human interventions that cause these breaches, so leaders can focus on how their organization is working, not on whether its assets are secured.
  • That their human assets are optimized as well. With a skilled DBA at the helm, DB users have the tools to understand more and deeper aspects of their information stores, facilitating a better understanding of corporate capacities and opportunities.

 
From a cost perspective, the ADW DBA also enhances corporate values. The role ensures that investment in data collection, management, and deployment strategies is maximized; why invest if you don’t gain all available benefits? The role also maximizes the value of the human DBA.

By eliminating their mundane data management and maintenance chores, your DBA is free to exploit and harness previously hidden assets and opportunities. Your investment in the DBA role enhances your investment in your ADW.

The Oracle ADW provides users with unparalleled data performance, facilitating elastic data management and enabling data deployment wherever and whenever it is needed. Harnessing the service with a dedicated ADW DBA ensures that your information infrastructure is optimized so your enterprise can utilize its maximum values. Contact us to learn more about how our Oracle experts can support your business goals.

Further Reading

DBAs Maximize Autonomous Database Investments
Download our paper to learn more about how DBAs go beyond just maintaining an operational database, how automation will improve digital performance and enhance the value of DBAs (not kill the DBA role) – to grow your organization’s market share.

Foundation Optimized: Oracle’s Cloud Infrastructure
Download our white paper to uncover how Oracle’s tech engineers built OCI with best practices and how it integrates services to meet organizations’ needs.


Database Scoped Configuration on SQL Server 2016

SQL Server 2016 introduced several new features that add functionality and flexibility to increase database performance.

I’ve been working as a production SQL Server DBA for 12+ years and have done health check analyses on countless SQL Servers; I noticed there were several occasions when my analysis report came up with a MAXDOP recommendation as one of the critical suggestions. However, almost half of the time we had to rule it out from our list just because an application connecting to that database only allowed the MAXDOP=1 setting. Most of these were legacy applications – which don’t give us room to take advantage of parallelism or parallel query execution.

I’ve also seen that, to reduce the cost of resources, many organizations host or consolidate multiple databases on a single machine. And if any application connecting to one of those databases has a SQL Server MAXDOP=1 limitation, it prevents other databases on that same SQL instance from utilizing parallelism, and overall server performance suffers, since the MAXDOP setting affects the entire instance.

Microsoft finally addressed that issue and introduced database-level configuration options in SQL Server 2016. With these new configuration options in the database properties, you can now configure the MAXDOP value at the database level rather than at the instance level. You can now have multiple databases on a single instance, each with a different MAXDOP value, and utilize parallelism in an efficient way.

You can see these database scoped configuration values on the Options tab under the database properties.

You can configure it either using the GUI or using the T-SQL command below:

Use <DatabaseName>
GO
ALTER DATABASE SCOPED CONFIGURATION SET MAXDOP = 8
GO

You can also view these configuration values by using the dynamic management view below:

SELECT * FROM  sys.database_scoped_configurations;

There are four configuration settings available at the database level via the GUI, plus one more that can only be set via T-SQL:

  1. MAXDOP. Using this option you can control parallelism at the database level. Depending on the number of logical processors, you can specify a numeric value for this option, which allows SQL Server to use up to that many processors to execute queries hitting that specific database in parallel.
  2. LEGACY CARDINALITY ESTIMATION. Cardinality estimation is how SQL Server generates the best possible execution plan by estimating the number of rows specific queries are going to return; this option controls whether the legacy estimator is used, and you can now turn it on or off at the database level.
  3. Parameter sniffing. SQL Server prepares the execution plan by evaluating the parameter passed to a stored procedure and reuses the same plan for that stored procedure’s subsequent runs, or for other procedures that are the same in nature or form. In this way it tries to improve performance by reusing the same plan and avoiding compilation time.

    However, this does create a problem for some complex procedures or queries that would require a more optimal plan than the one created from a single parameter value. In SQL 2016, you may disable this option at the database level and have the query optimizer build plans from general statistics rather than the sniffed parameter value.

  4. Query Optimizer Fixes. If you turn this option on for a database, SQL Server allows the query optimizer, for all queries on that database, to use all the features that came with the latest SQL hotfix or patch applied to that SQL instance.

    This was not the case on earlier SQL versions. On versions prior to SQL 2016, the query optimizer was not able to immediately use all the latest fixes that come with a SQL hotfix, to avoid situations where some queries might perform badly with the new changes.

    However, you were still able to do so by turning on trace flag 4199, which enables the fixes for all queries on all databases on the SQL instance. Now you can enable this feature at the database level without affecting queries on other databases on the same instance.

    One thing to mention here: this feature is only useful if you are running SQL 2016 but the database compatibility level is set below 130 (SQL 2016). If a database is already at compatibility level 130, enabling or disabling this option is not of much use, because at that compatibility level SQL Server by default allows the optimizer to use the latest changes that hotfixes offer.

  5. CLEAR PROCEDURE CACHE. This option is not available in the GUI (SSMS), but you can configure it using the ALTER DATABASE SCOPED CONFIGURATION command at the database level. As the name suggests, with this option you can now clear the procedure cache at the database level. I can’t remember how many times I have had to execute DBCC FREEPROCCACHE while doing stored procedure or query tuning work. With this option, I can do it at the database level without affecting the procedure cache of other databases on the same SQL instance. Example commands for all of these options are shown after this list.
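As a quick reference, here is a sketch in T-SQL of how each of the five options above can be set; the MAXDOP value and the ON/OFF choices are examples only, not recommendations:

USE <DatabaseName>
GO
-- 1. Limit parallelism for this database only
ALTER DATABASE SCOPED CONFIGURATION SET MAXDOP = 4;
-- 2. Switch this database to the legacy cardinality estimator
ALTER DATABASE SCOPED CONFIGURATION SET LEGACY_CARDINALITY_ESTIMATION = ON;
-- 3. Disable parameter sniffing for this database
ALTER DATABASE SCOPED CONFIGURATION SET PARAMETER_SNIFFING = OFF;
-- 4. Allow the optimizer to use the latest hotfix behaviour
ALTER DATABASE SCOPED CONFIGURATION SET QUERY_OPTIMIZER_HOTFIXES = ON;
-- 5. Clear the plan cache for this database only
ALTER DATABASE SCOPED CONFIGURATION CLEAR PROCEDURE_CACHE;
GO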

 

Another fantastic thing about these options is that, if you have databases configured in SQL Server Always On, it allows you to set different values for these options on the primary and on the secondary server.

For example, you can set MAXDOP=1 for a database on the primary server and keep MAXDOP=3 for the same database on the secondary server in SQL Always On.

You can achieve this by running the queries below:

ALTER DATABASE SCOPED CONFIGURATION SET MAXDOP =1;
GO
ALTER DATABASE SCOPED CONFIGURATION FOR SECONDARY SET MAXDOP=3;
GO

Nowadays, many organizations configure a readable secondary replica in SQL Always On to offload read-only workloads. This feature is very helpful there: you can set MAXDOP=1 for a database on the primary replica, mostly because of application demands, and at the same time set MAXDOP=3 for the same database on the secondary replica, achieving better performance for all read-only work or queries with parallelism enabled.

Here are some things to keep in mind or consider when you use these options:

  • The ALTER ANY DATABASE SCOPED CONFIGURATION permission is required in order to enable these database options (see the examples after this list).
  • You can set up different scoped options for a database on the secondary replica than on the primary, but all secondary databases must have the same configuration values; different values on individual secondary databases are not allowed.
  • If you use a query hint in a SQL statement, it overrides the database scoped configuration (see the examples after this list).
  • If you are running queries under the context of a different database, as shown below:
     
    Use Tempdb
    go
    select * from DBA.dbo.testtable
    go

    then these new database settings do not apply. If you have set MAXDOP=2 on the DBA database, the above query won’t be able to use parallelism. You must run it under the context of the DBA database only, as shown below:

    Use DBA
    go
    select * from DBA.dbo.testtable
    go
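To illustrate the first and third bullets above, here is a minimal sketch - the login name ReportUser is a placeholder, and the hint example reuses the testtable from above:

-- Permission needed to change database scoped configurations
GRANT ALTER ANY DATABASE SCOPED CONFIGURATION TO ReportUser;
GO
-- A query-level hint overrides the database scoped MAXDOP value
SELECT * FROM DBA.dbo.testtable OPTION (MAXDOP 4);
GO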

 

SQL Server continues to evolve as a database leader with new features to streamline processes for DBAs. If you’re looking for support with SQL Server, please reach out; our experts are standing by.


Analysing Social Media Activity with ADW and OAC


Yesterday I wrote a series of tweets talking about my Twitter activity analysis with Oracle's Autonomous Database and Oracle Analytics Cloud. Since the tweet became popular and I didn't share the steps, I thought a blog post should follow, so... here we are!

Getting the Data from Twitter

There are many ways that you can get data from Twitter. For the sake of my exercise I requested a dump of my data from the Twitter website. You can request the same by following these instructions.

Once the data is ready, you'll receive a mail or a notification and you'll be able to download the related zip file named twitter-YYYY-MM-DD-hashed_string.zip.

Once unzipped you'll see two folders:

  • assets containing files you attached in your tweets (mostly images)
  • data where the actual interesting information is.

Within the data folder I concentrated on the tweet.js file which contains the tweet information. There are other interesting files such as followers.js or like.js but I didn't include those datasets in my analysis as of now.

Warning: the tweet.js dataset contains only the tweets written by you. It doesn't contain all people tagging you in a tweet or replies to your tweets.

The Your archive.html file allows you to browse the content of the folder from a web browser.


Importing Data into ADW with SQL Developer Web

The next step in my analysis is to import the data into the Autonomous Data Warehouse. For this task, I used SQLDeveloper Web, available by default in ADW, which has a powerful utility to import JSON documents as rows in a relational table. Jeff Smith's post covers the process in detail.

Unfortunately, when trying to import the tweet.js file into ADW I encountered an error due to the fact that the file itself is not pure JSON - pretty clear if you check the file itself:

window.YTD.tweet.part0 = [ {
  "tweet" : {
    "retweeted" : false,
    "source" : "<a href=\"http://twitter.com/download/iphone\" rel=\"nofollow\">Twitter for iPhone</a>",
    .....
    "lang" : "en"
  }
},
....
{
  "tweet" : {
    "retweeted" : false,
    "source" : "<a href=\"http://twitter.com/download/iphone\" rel=\"nofollow\">Twitter for iPhone</a>",
    .....
    "lang" : "en"
  }
 }
 ]

The first item to remove from our file is the window.YTD.tweet.part0 = prefix. I believe this is due to the pagination of the results, but it clearly screws up the JSON formatting.

Once removed, I could parse the tweet.js file with SQLDeveloper Web, but the proposed table definition had only one column containing the whole TWEET JSON document.


At this point, I could either accept this and do further processing using Oracle's SQL JSON parsing functionality, or slightly change the shape of the file so it would be ingested correctly; I opted for the second.
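For completeness, the first route would have looked something like the sketch below, using Oracle's JSON_TABLE over the single-column table proposed by the import wizard - the table name TWEET_RAW and column name TWEET_DOC are hypothetical placeholders for whatever the wizard would have created:

SELECT jt.id, jt.created_at, jt.full_text
FROM   tweet_raw t,
       JSON_TABLE(t.tweet_doc, '$'
         COLUMNS (id         VARCHAR2(30)   PATH '$.id',
                  created_at VARCHAR2(40)   PATH '$.created_at',
                  full_text  VARCHAR2(4000) PATH '$.full_text')) jt;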

When importing JSON documents into rows, SQLDeveloper Web analyses only the first level of attributes in the document itself. In the case of our tweet.js file this was something like:

{
  "tweet" : {
    ...
    }
},
{
  "tweet" : {
    ...
    }
},

The first level parsed by SQL Developer Web was correctly extracting only the tweet element and proposing a CLOB (JSON) column to store it. But I wanted the content of the tweet to be parsed. I ended up removing the first layer by substituting any occurrence of },{ "tweet" : { in the file with a simple comma and removing the initial and final brackets.

The file now looks like the following:

[ {
    "id" : "279908827007180800",
    "created_at" : "Sat Dec 15 11:20:39 +0000 2012",
    "full_text" : "Finally at home after #christmasparty... Looooong travel!",
    "lang" : "en"
    ...
  }
,{
    "id" : "276794944394498048",
    "created_at" : "Thu Dec 06 21:07:12 +0000 2012",
    "full_text" : "@mRainey will you be there next week too? Enjoy uk and visit #italy if you can!",
    "lang" : "en",
    ...
  }
 ...
 }]

We can now parse the file with SQL Developer Web, and the output correctly identifies all the first level entities in the JSON document.


A few more clicks and we have our table FRANCESCO.TWEET_DATA_SQLDEV populated automagically!


Note: The whole process above could be implemented and automated in several different ways; the aim of this blog post is only to demonstrate the feasibility of the analysis.

Text Tokenization with Oracle Text

The tokenization of the Tweet's FULL_TEXT column is done in the Oracle Database; you can see the full process described in the following video produced by the Oracle Analytics PM team.

If you prefer a short version of it, here it is: I basically created the following index

CREATE INDEX "FRANCESCO"."TWITTER_TWEET_IDX" ON "FRANCESCO"."TWEET_DATA_SQLDEV" ("FULL_TEXT") INDEXTYPE IS "CTXSYS"."CONTEXT";

The INDEXTYPE IS "CTXSYS"."CONTEXT" creates an index using Oracle Text. A more accurate description of the procedure can be found in OracleBase's post.

Once the index is created, we can see some new tables appearing with the name DR$INDEX_NAME$LETTER with

  • DR$ being a fixed prefix
  • INDEX_NAME the name of the index
  • LETTER a single letter among I, K, N, U, R, whose meaning can be found in this Document

For the purpose of our analysis, we'll focus on the DR$TWITTER_TWEET_IDX$I table which contains the tokens of our FULL_TEXT column.
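As a quick sanity check before joining anything, you can look at the most frequent tokens straight from that table - a small sketch assuming the default columns of a CONTEXT index $I table:

SELECT token_text, token_count
FROM   dr$twitter_tweet_idx$i
ORDER  BY token_count DESC
FETCH FIRST 20 ROWS ONLY;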

But the token by itself is not very useful; we need to match the token with the Tweet's ID to be able to provide meaningful analysis. Again, this is covered nicely by another video created by the Oracle Analytics PM team.

In order to associate the token with the original Tweet, we can again use the power of Oracle Text and the index created above, with the following query:

SELECT
    full_text,
    score(1) AS text_score,
    token_text,
    a.id
FROM
    tweet_data_sqldev a,
    (
        SELECT DISTINCT
            token_text
        FROM
            dr$twitter_tweet_idx$i
    )
WHERE
    contains(a.FULL_TEXT, '/' || token_text, 1) > 0

Again for more info about Oracle Text's CONTAINS function please refer to the relevant documentation.

I physicalized the output of the above query in a table (TWEET_TOKENS), which contains the TOKEN_TEXT together with the Tweet's ID, so we can now join this table with the original one containing the list of Tweets in Oracle Analytics Cloud.
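A minimal sketch of that step, simply materialising the query above as-is (assuming a plain CTAS is acceptable for your data volumes):

CREATE TABLE tweet_tokens AS
SELECT a.id,
       token_text,
       score(1) AS text_score,
       full_text
FROM   tweet_data_sqldev a,
       (SELECT DISTINCT token_text FROM dr$twitter_tweet_idx$i)
WHERE  contains(a.full_text, '/' || token_text, 1) > 0;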

Note: One of the next versions of Oracle Analytics Cloud will provide the tokenization as a step of a DataFlow within the Database Analytics options! You'll be able to tokenize your strings without leaving OAC.


Analysis of the Data in Oracle Analytics Cloud

If you're a frequent blog reader, this is probably the easiest part. I just had to:

  • Create a connection to my Autonomous DataWarehouse by using the wallet file.
  • Create the two datasources: one for TWEET_DATA_SQLDEV and another for TWEET_TOKENS
  • Create a project and include both Datasources

Once in the project, the first visualization is about the most "liked" tweet... no surprise, it's when I became an Oracle ACE Director, back in 2019, during the KScope keynote!


This first visualization is ok, but not really using any of the Oracle Text capabilities exposed in the Tokens... so my next analysis was...

Which words do I use most often when tweeting?

Easy... with the two datasources created above!

How does it change when I reply to people?

Well, you can see that words about OUG sessions and Oracle are still there, but there is the emerging topic of Food!

Who do I interact most with?

I plotted the # of Replies to specific Users…
No wonder (at least to me) that I get the most interactions with people that tag me with Italian food abominations…

And... how do I reply to them?

Again, very easy with Oracle Analytics Cloud Trellis Options.
You can spot that Food is the major discussion topic with HeliFromFinland and dw_pete, while my chats with @connor_mc_d and @stewartbryson cover more topics.

What about Nephentur?

One last mention of my interactions with Nephentur: his love for OUGs, airports, flights and hotels is clear… all driven by #TheBergEffect.


Hope you liked the story - just an example of what you can do with a dataset you own and the tools available in the Oracle Cloud!