Top Tips for Agile Database Design

When creating an application or a database, it is crucial to consider your approach to agile database design from the outset in order to avoid issues down the road. Because change is bound to happen, you need a system that, much like application source code, allows change to occur as smoothly as possible.

A scalable application depends on a good database design, because the data model and the database design define the structure of the database and the relationships between the data sets, and they form part of the foundation on which the application is built.

Application Source Code

It must be noted that database design is not the same as application source code. While both act as blueprints for something that will be built, and both can change over time, when the source code of an application changes, the entire application is rebuilt.


The existing instances are not simply modified; they are created anew. In contrast, database changes are applied directly to the instance. For this reason, the editing and maintenance methods used for application source code will not work for a database design.

Likewise, while for application source code only the final state matters, in database design each individual change must be considered and noted. Every change to the database needs to be recorded individually so that it can be applied to other instances of that database, and the final state of the database design itself should be recorded in the database model.


Furthermore, changes should only be made to an existing database instance by executing specific SQL statements. Changes cannot be partially implemented: each change either completes successfully or not at all, and changes must be executed in the correct sequence, both for repeatability and to respect any dependencies that exist between them.

Best Practices

It is a best practice that anyone who wants to make a change to the agile database design must first file a request, and these requests should be recorded somewhere to provide an audit trail. Changes should then be reviewed and approved by the data modeller, to ensure a good, scalable design.

The data modeller will also be responsible for updating the database model, which is the record of the database design, and producing the SQL statement that implements that change.

The above role can also be automated, so that the database designer or data modeller is only needed full time during the first stage of development.

While all development cycles will involve the database designer, the role itself could be filled by someone else within the development cycle, such as a developer or database administrator, because it is no longer as time consuming as it was at the outset.

Each database change should be given its own unique identifier, which identifies the particular change, regardless of the database instances it is applied to, or which versions of the application it appears in.


Each change to a particular database instance must be recorded within the database instance itself in order to avoid applying the same changes twice. This implies that you maintain some form of change history table within each database instance, which should include the date and time each change occurred.
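As a sketch of that idea, here is a minimal change history table using SQLite; the table and column names are illustrative assumptions, not taken from any particular product:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a real database instance
conn.execute("""
    CREATE TABLE change_history (
        change_id  TEXT PRIMARY KEY,  -- the change's unique identifier
        applied_at TEXT NOT NULL      -- date and time the change was applied
    )
""")

def record_change(conn, change_id):
    """Record a change once; return False if it was already applied."""
    try:
        conn.execute(
            "INSERT INTO change_history (change_id, applied_at) "
            "VALUES (?, datetime('now'))",
            (change_id,),
        )
        return True
    except sqlite3.IntegrityError:  # the PRIMARY KEY rejects a duplicate
        return False

print(record_change(conn, "CH-042"))  # True: first application
print(record_change(conn, "CH-042"))  # False: already applied, skipped
```

Making the change identifier the primary key is what prevents the same change from being applied twice.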

The set of changes should be stored in a single change log that includes each change identifier and the SQL statement that implemented the change, allowing all necessary changes to be easily located and sequenced.

This change log should be well structured so that a program can read and decode the changes in it, but there should be separate instances of the database design and the change log file in each version of the application source code so that changes can be made independently.

Database Changes

SQL statements are often specific to one database product, such as Oracle, Sybase, SQL Server or MySQL. The change log will need to record the separate SQL statements for each supported database product for each database change. Thus, each change that you make will have an identifier, and an SQL statement for each database product, clearly labelled with the database product name.

This whole solution is brought together in a program that is run during an application upgrade. It opens the change log XML file and reads in each change. If that change has not yet been applied to this particular database instance, then the corresponding SQL statement is executed and a record inserted into the Change History metadata table.
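A minimal sketch of such an upgrade program, using SQLite and a hypothetical XML change log layout (the element names, the `upgrade` function, and the per-product `product` attribute are my own illustrative assumptions; real tools use their own formats):

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical change log layout; each change carries one SQL statement
# per supported database product, labelled with the product name.
CHANGE_LOG = """
<changelog>
  <change id="CH-001">
    <sql product="sqlite">CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)</sql>
    <sql product="oracle">CREATE TABLE customer (id NUMBER PRIMARY KEY, name VARCHAR2(100))</sql>
  </change>
  <change id="CH-002">
    <sql product="sqlite">ALTER TABLE customer ADD COLUMN email TEXT</sql>
    <sql product="oracle">ALTER TABLE customer ADD (email VARCHAR2(255))</sql>
  </change>
</changelog>
"""

def upgrade(conn, changelog_xml, product):
    """Apply, in order, every change not yet recorded for this instance."""
    conn.execute("CREATE TABLE IF NOT EXISTS change_history "
                 "(change_id TEXT PRIMARY KEY, applied_at TEXT NOT NULL)")
    applied = {row[0] for row in
               conn.execute("SELECT change_id FROM change_history")}
    for change in ET.fromstring(changelog_xml):
        cid = change.get("id")
        if cid in applied:
            continue  # never apply the same change twice
        stmt = change.find(f"sql[@product='{product}']")
        conn.execute(stmt.text)
        conn.execute("INSERT INTO change_history VALUES (?, datetime('now'))",
                     (cid,))

conn = sqlite3.connect(":memory:")
upgrade(conn, CHANGE_LOG, "sqlite")
upgrade(conn, CHANGE_LOG, "sqlite")  # re-running is a no-op
```

Because each applied change is recorded in the Change History table, running the upgrade again against the same instance does nothing.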

The importance of a well-made agile database design cannot be overstated. While there are many best practices for developers, these are some specific tips that I have found to be useful. I believe that if you follow these tips you can ensure a scalable application that will run smoothly, despite the changes.

The 6 Best Practices of Continuous Delivery Pipelining

Continuous delivery is a method that promotes the adoption of an automated deployment pipeline to quickly and reliably release software into production. This method comes from the agile school and is a natural partner to the DevOps movement.

The goal of continuous delivery pipelining is to establish an optimized end-to-end process, enhance the development to production cycles, lower the risk of release problems, and provide a quicker time to market.

Break it Down to Build it Up

In one of my previous blog posts, I listed Jez Humble’s 8 Principles of Continuous Delivery. In order to achieve the Holy Grail of an “automatic, high quality, repeatable, reliable, continuously improving process”, you must first break that process into simpler component practices.

Building the “pipeline” in this way will enable you to deal with the different stages of the process, one by one.

A deployment pipeline makes sure a change is processing in a controlled flow. The system is as follows:

  • A code check-in or configuration change triggers the flow
  • The change is compiled
  • The change goes through a set of tests – usually unit tests and static code analysis
  • Passing that set of tests triggers automatic application tests and regression tests

After successfully passing these tests, the change can be either ready for production use, or go through additional manual and user-acceptance tests before hitting production.

The Continuous Delivery Pipelining Checklist

Achieving an efficient deployment pipeline is done by following these best practices:

  1. Build your binaries only once: Compiling code should only happen once, eliminating the risk of introducing differences due to environments, third-party libraries, different compilation contexts, and configuration differences.
  2. Deploy the same way to all environments: Use the same automated release mechanism for each environment, making sure the deployment process itself is not a source of potential issues. You deploy many times to lower environments (integration, QA, etc.) and fewer times to higher environments (pre-production and production), and you can’t afford a failed deployment to production caused by the least-tested deployment process.
  3. Smoke-test your deployments: A non-exhaustive software test that touches every major component (services, database, messaging bus, external services, etc.) without going into finer detail, but ascertains that the most crucial functions of a program work, will give you the confidence that your application actually runs and passes basic diagnostics.

  4. Deploy into a copy of production: Create a production-like or pre-production environment, identical to production, to validate changes before pushing them to production. This will eliminate mismatches and last minute surprises. A copy of production should be as close to production as possible with regards to infrastructure, operating system, databases, patches, network topology, firewalls, and configuration.
  5. Instant propagation: The first stage should be triggered upon every check-in, and each stage should trigger the next one immediately upon successful completion. If you build code hourly, run acceptance tests nightly, and run load tests over the weekend, you will never achieve an efficient process or a reliable feedback loop.
  6. Stop the line: When a stage in the pipeline fails, you should automatically stop the process. Fix whatever broke, and start again from scratch before doing anything else.
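The fail-fast behaviour behind points 5 and 6 can be sketched in a few lines of Python (the stage names and always-passing commands are placeholders, not a real pipeline):

```python
def run_pipeline(stages):
    """Run stages in order; stop the line at the first failure."""
    for name, stage in stages:
        if not stage():
            print(f"FAILED at {name}: stopping the line")
            return False
        print(f"passed: {name}")
    print("ready for production")
    return True

# Illustrative stages; real ones would compile code, run tests, deploy, etc.
ok = run_pipeline([
    ("compile",          lambda: True),
    ("unit-tests",       lambda: True),
    ("smoke-test",       lambda: True),
    ("acceptance-tests", lambda: True),
])
```

Each stage triggers the next immediately on success, and any failure halts the whole flow, which is exactly the "stop the line" discipline described above.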

The pipeline process helps establish a release mechanism that will reduce development costs, minimize risk of release failures, and allow you to practice your production releases many times before actually pushing the “release to production” button.

In Short: the Logic

Continuous improvement of the automated pipeline process will ensure that fewer and fewer holes remain, guaranteeing quality and making sure that you always retain visibility of production readiness.

Making sure your database can participate in the efficient deployment pipeline is obviously critical. However, the database presents different challenges than application code does, and implementing continuous delivery for the database proves to be a challenge.

5 Must-Have Database Lifecycle Management Capabilities

As a DBA, managing your company’s valuable data is a primary concern which you must balance with your day-to-day responsibilities.

This includes continuous installation, configuration, migration, and updates to the database. Ultimately, any problems that arise with the database fall onto your shoulders – along with the responsibility to mitigate risks and implement fixes as quickly as possible.

With constant pressure for technology companies to release frequent updates to applications, overcoming common challenges in database development lifecycle management is always a concern.

Implementing DBLC Solutions

Here are 5 database lifecycle management capabilities, which offer highly efficient ways to manage the database schema, data, and metadata for a database application.


By implementing these solutions, you will spend less time retracing steps and tracking down information and documentation, both of which are unnecessary and expensive.

1.  Effective Risk Mitigation

A three-way analysis mechanism highlights database conflicts, enabling you to detect and respond to requested changes and to protect corporate information. This analysis provides a valuable alternative to damage control, comparing objects between environments and generating the proper commands to reflect the change.

A three-way, Baseline-Aware Analysis, goes beyond the capabilities of standard compare-and-sync tools. It identifies not only that configuration drift exists, but where the changes originated and whether they’re safe to deploy. This mitigates one of the most prominent risks faced by DBA Managers attempting to implement continuous delivery for the database.

This allows you to save a snapshot of the database schemas and lookup content before and after successfully rolling out any version. These snapshots can be used as a golden copy and act as a baseline when generating the next release’s database build script.
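The logic of a baseline-aware, three-way comparison can be illustrated with a toy function (object definitions are reduced to strings, and the classification labels are my own, not any vendor's):

```python
def classify(baseline, source, target):
    """Compare a target object against source control, using the baseline
    snapshot to tell a legitimate new change from configuration drift."""
    if source == target:
        return "in sync"            # nothing to deploy
    if target == baseline:
        return "safe to deploy"     # only source control changed
    if source == baseline:
        return "drift in target"    # someone changed the target directly
    return "conflict"               # both sides changed: needs review

# Baseline snapshot taken at the last successful release:
baseline = "CREATE INDEX ix_name ON customer(name)"
source   = "CREATE INDEX ix_name ON customer(name, email)"  # developer's change
target   = "CREATE INDEX ix_name ON customer(name)"         # untouched since release
print(classify(baseline, source, target))  # safe to deploy
```

A plain two-way compare-and-sync tool sees only that source and target differ; the baseline is what reveals where the difference originated.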

2.  Business Agility Made Possible with a Deployment Manager Wizard

DBAs should automatically generate the build script (deploy, promote, or upgrade script) based on Change Requests (CR) or business requirements.

This enables anyone to generate the database build script with a few mouse clicks, selecting the objects to be analyzed based on several criteria, such as application version, label, object type, object name, Change Request (CR), related business requirement, and so on.

By creating the database build script, you can drastically reduce the preparation phase in the release cycle and ship out rapid changes to satisfy the ever-increasing demands for leaner, more agile development cycles.

This business agility is often sought after but rarely achievable when it comes to database development due to the challenges faced with continuous delivery for the database. The preparation phase should be reduced from days and weeks to just minutes and hours.

3.  Improve Development Collaboration and Gain a Complete Audit Trail History


An efficient check-in, check-out process prevents any team member from making changes to database objects or content without first checking the item out. The check-out process locks the object for access by other team members, preventing two development teams from making simultaneous changes to the same objects.

Development collaboration is drastically improved by avoiding out-of-process changes that create conflicts and issues later in the development lifecycle.

When an object is checked in, the developer is prompted to enter the reason for the change, with everything else automatically documented, including who made the change, where, and when – creating a complete audit trail to improve compliance and database governance.
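A toy sketch of the check-out lock and audit trail described above (in-memory only and purely illustrative; a real tool persists this in the repository):

```python
import datetime

locks = {}  # object name -> team member currently holding the check-out
audit = []  # complete audit trail: who changed what, when, and why

def check_out(obj, user):
    """Lock an object for one user; fail if someone else holds it."""
    if obj in locks:
        return False  # already checked out by another team member
    locks[obj] = user
    return True

def check_in(obj, user, reason):
    """Release the lock and record the change with its stated reason."""
    assert locks.get(obj) == user, "must hold the check-out lock"
    del locks[obj]
    audit.append({"object": obj, "who": user, "why": reason,
                  "when": datetime.datetime.now().isoformat()})

check_out("customer_table", "alice")
print(check_out("customer_table", "bob"))  # False: alice holds the lock
check_in("customer_table", "alice", "add email column")
print(check_out("customer_table", "bob"))  # True: lock released
```

The developer supplies only the reason; who, what, and when are captured automatically at check-in, which is what makes the audit trail complete.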

4.  Support for Policy-Based Database Development and Deployment

One of the major challenges for DBA Managers is differentiating roles and responsibilities when it comes to the database. DBA teams rarely have capacity for code review, yet are left to deal with damage control when things break.

The right support regarding Change Policy Enforcement also enables you to harness your Active Directory policies and impose them on the databases in every environment.

For example, you can give developers full access in the development environment and a read credential in the testing environment (if required for debugging purposes), but restrict them from the structure and lookup content changes in the User Acceptance Testing (UAT) and the production environments.

5.  Automation and Collaboration

Boost overall developer and DBA productivity by 20% through automated processes and collaboration functionality. This also helps create a competitive advantage and a shorter time to market while reducing the time spent managing changes.

Database Lifecycle Management: On Point, On Demand

At the end of the day, you need to be the master of your database domain — and that extends to every aspect and phase of lifecycle management. To get the most out of your database, it’s imperative that you maintain strict adherence to well-defined and thoroughly adaptable database change and release policies across all 5 database lifecycle phases — requirements analysis, logical design, physical design, implementation, monitoring and maintenance.

It’s important to bear in mind that the lifecycle is just that — a cycle. It doesn’t necessarily end and you should be prepared to move between lifecycle phases in as fluid and streamlined a manner as possible. Be sure to clearly delineate and enforce roles and responsibilities across every permutation of the database, starting from the pre-development stage.

For DBAs, protecting your company’s valuable data is a primary concern. Having database lifecycle management capabilities can significantly reduce the need for costly delays, application glitches, and backouts.

From streamlined workflows to automated change history tracking, database release automation, and automated merge and build, all with a single source of truth you can trust, complete database lifecycle management capabilities simplify the day-to-day responsibilities of DBAs and help avoid unnecessary, time-consuming, and expensive delays.

Enjoyed this article? Find out what are the best practices for database management in our next article.

[InfoGraphic] 10 Maxims to Guide the Agile CIO

Agile methodology seeks to transform the functional norms that define your IT operation. When properly executed, it will also transform your business outcomes. Of course, a true embrace of agile cannot be accomplished with a mere checklist; it requires a cultural mind shift from the agile CIO down to the entire organization.

In an effort to bottle that transformative capability, Gartner distilled the most crucial and elusive concepts/tactics of agile into 10 basic maxims. Desiring to make those insights more accessible and more consumable, I’ve taken the liberty of recreating that seminal article in graphic form.

Source Control and Continuous Delivery: Key to DevOps

DevOps is a revolutionary way to release software quickly and efficiently while maintaining a high level of security. While it has proven to be the wave of the future, many still have yet to hop aboard the DevOps train.

In an effort to demystify the unknowns associated with new business practices, Danilo Sato wrote DevOps in Practice: Reliable and Automated Software Delivery, with a mind to unpack any anxieties that may besiege an operation considering the move to DevOps.

Sato found that clients were able to appreciate the theory and benefits of DevOps, but putting them into practice seemed to be difficult for most. He found that solving a problem through technology was only possible when users were coached to understand the root of the issue.

You Cannot Do What You Do Not Understand

One of the major problems companies have is that often a process is only introduced as a way to prevent a failure that occurred in the past. The implementation of the process usually solves the problem, but often it introduces new problems, creating a vicious cycle.

To alleviate this problem Danilo recommends investing in automating the process of releasing software, breaking the cycle of piecemeal solutions to large overarching issues.

Sato sees DevOps as an attempt to break the barrier between developer and operator teams. Traditionally two independently operating units in the process, DevOps partially combines these departments and unifies their goals. Developers are responsible for new features, while operators keep things running smoothly.

One expression of Sato’s refrain (that a fundamental understanding of the root issue is key) is given substance here. People think that by simply creating “DevOps teams,” they’re practicing DevOps. But DevOps is not a singular entity; it’s a company-spanning change in philosophy, culture, and action. In other words, it’s not enough to put Dev and Ops in the same work place, refer to them as DevOps, and call it a day.

DevOps, Between Theory & Implementation

Some components of DevOps are quite easy to implement. Keeping everything in source control, for example, Sato says, is a good way to begin. The process allows both application and infrastructure changes to be tracked and traced while being quality tested.

Automating the build and deployment process is also at the top of Sato’s list. He states that it reduces human errors and speeds up the software production process. Sato encourages his teams to develop on production-like environments. This reduces the incompatibilities often found between development and production (e.g. a developer using a Windows-based machine and deploying to a Linux server, or different versions of libraries and dependencies), allowing problems to be found more quickly.

Automation is vital in many of the processes discussed in the book. Automated processes enable more frequent deployments and shorter cycle times. This, in turn, gives companies metrics which can be used to further improve and build on DevOps. It also allows businesses to gain a competitive advantage over those not using automation, and their customers are often more satisfied.

DevOps: Putting Source Control and Continuous Delivery to Work

In many ways, DevOps is simply the structure that allows for the achievement of continuous software delivery. To this extent, in order to succeed, an operation needs a healthy appreciation for both the philosophy and practical side of things. For most operations, source control is an excellent case in point for DevOps practice, while the goal of truly continuous delivery should underly the philosophy.

Accordingly, DevOps, source control and continuous delivery are the three pillars of a healthy, modern software operation.

DevOps enables continuous improvement. It is constantly hypothesizing, experimenting, and collecting data from those experiments to either validate or reject a hypothesis. This increases deployment frequency and allows the process to improve every time it runs.

Source control ensures that this experimental, iterative process doesn’t undo any of the good work already validated and built upon, while continuous delivery, if all goes right, is the outcome.

Following (and indeed understanding) this process is key to empowering businesses to constantly re-evaluate and improve on their existing methods, while ensuring that this recursive self-examination does not jeopardize system integrity or cost management.

For those looking to get the ball rolling, Sato recommends teams assess their current process and ask how long it takes and how it can be made faster without compromising quality or security. The answers to these questions are usually found in the canon of DevOps. The trick is realizing that.

If you enjoyed this post, check out our guide on DevOps release process.

Top 5 Best Database Software Solutions for More Agile Development

Figuring out the most efficient database development solutions can prove critical in avoiding potential pitfalls and effectively dealing with today’s rapid business cycles.

“Time is money” is not just a cliché thrown around in long, drawn-out, inefficient business meetings. For database development teams, maximizing competence, performance, adaptability, and readiness will help simplify development and allow automation to achieve repeatable processes, all while avoiding potential risks that create downtime.


These nimble development solutions are in high demand, coveted for their ability to streamline processes with an end-to-end approach that allows developers to simplify development, stay on top of the latest technology innovations, and build modern applications.

 

Among the Best Database Software Solutions

Here are our picks for some of the most valuable and agile database development solutions:

  1. Dell Software Toad Development Suite – Ranked number one in the “Database Development and Optimization” software submarket by IDC in November 2015, Toad helps you ensure repeatable processes supported by agile database development, minimize risks associated with changes to the database, and automate SQL development. Most notably, its highly visual interface helps reduce the learning curve while supporting a wide variety of database platforms.
  2. Microsoft Visual Studio – Great for regulating the development life cycle as an imperative part of your application development. Visual Studio can help implement an isolated development environment for each user, so team members can work simultaneously without interfering with other team members or projects. As its major added value, managing database change helps increase coordination efficiency among developers and database administrators.
  3. Oracle SQL Developer – The development environment for Oracle Database, which Oracle bills as the first relational database designed for the cloud. It offers users the power and performance of a leading database across a wide variety of the most popular application development technologies, along with leading security, transaction processing, data warehousing, and big data capabilities.
  4. DBmaestro TeamWork – An industry leader in agile database development and deployment, DBmaestro deals with conflicts and merges while relying on a baseline-aware analysis build engine that “takes your database into safe automation.” DBmaestro’s unique technology ensures a single source of truth through its Enforced Source Control. DBmaestro also produces an audit trail that shows who did what to the database, and limits access with its enhanced database security and regulatory compliance system.
  5. Idera DB Power Studio – One of the under-appreciated but still best database software solutions, this product combines 4 solutions that help build and maintain mission critical database applications, streamline the database change management process, and quickly pinpoint and fix performance bottlenecks:
    1. Multi-platform database administration
    2. Automate and manage complex database schema changes
    3. Automate SQL Tuning and profiling
    4. Develop SQL code more efficiently

The Evolution of the Agile Database Developer

The goal of a modern database developer is to expand the database’s range of functionality. In this ever expanding industry, the objective remains the same: “help organizations extract value from data, integrate it with new and traditional sources, and ensure quality and security.”

This rapid evolution from rigidly structured data to databases that can handle different data structures has allowed each database administrator to work with the developer to customize the intricacies of building, developing, and preserving their database to their specific needs.

Why Stop At Five?

These 5 companies are industry leaders in the field and, as mentioned above, have streamlined the database development process. In so doing, they’ve played a big part in helping developers stay on top of the latest innovations as they strive to build quality, modern applications.

That said, we know there are others out there, perhaps less known, but just as worthy of our attention.

We can’t appreciate what we’re blind and deaf to so we’re enlisting your help. Be our eyes and ears. Help make this list more complete. Add your top solutions in the comments below.

Conclusion

The database’s limitations are slowly disappearing as solutions are put in place that allow for faster and more secure processes. For organizations, choosing the best database software for their specific development needs is imperative to long-term success.

If you enjoyed this post, then you might also be interested in the differences between application and database.

The Integration of DevOps and Cybersecurity

Glitches. Security flaws. Slowdowns. These are all expensive to patch and come with negative press that is hard to recover from. How do the DevOps and Cyber Security teams work together to manage these risks, especially when the release is time sensitive?

Even those of us that have fully integrated development and operations into DevOps still remember when the teams were in two separate departments. This led to costly challenges that only came to light after going to market, problems that could have been prevented if development and operations had been centralized.


For those that employ DevOps, it’s hard to imagine development and operations as separate departments. DevOps has made monumental strides, but there is still one more step to take to maximize risk management: fully integrate cyber security into DevOps.

Both the DevOps and security personnel need to come to terms with each other’s primary objectives: DevOps wants to rapidly develop and deploy software, while Cyber Security personnel want to mitigate and manage risk by thoroughly checking for any potential breachable point in the software.

A Growing Friendship

While Cyber Security is currently integrated into DevOps, I think that increasing communication between the two departments will dramatically improve risk management and the handling of issues as they arise.

At the Symantec Government Symposium, a DevOps programmer once joked that “We don’t need to have all this security risk management stuff, we don’t need to have cyber security, we need a solution now.”

David Blankenhorn, CTO of DLT Solutions, echoed the sentiment. “The reality of the DevOps environment is not that you’re doing your testing, your security…it’s that you’re doing it on a much more micro scale.”


At AppSecUSA, the annual gathering of the Open Web Application Security Project, white hat hacker Josh Corman argued that it’s on the security professionals to adjust to the centralized environment of DevOps teams. “The DevOps tribe is willing to give us a big gushy hug…stop resisting empathy that comes with teamwork.”

Corman reiterated that he believes the root of the disconnect is mutual misunderstanding. “[Cyber Security Professionals] call it mitigation and patching; they [DevOps] call it unscheduled critical work.” Corman believes that the only way for DevOps to improve efficiency is to increase security and risk management. DevOps is realizing it too.

Immediate Results

Brian A. McHenry, a Senior Security Solutions Architect at F5 Networks, discussed the advantages of converging the DevOps and Cyber Security worlds to improve the ability to minimize and manage risks.

“Embracing SecDevOps as a component of a larger DevOps culture and philosophy enables us to seek out tools and skills that would leverage existing API opportunities and drive decisions toward a more fully integrated approach to SDN.

These new skills and tools may even be an extension of existing practices…SecDevOps would help automate and orchestrate any needed changes in the security service chains.”

SecDevOps integrates security measures into its development and deployment philosophy, as security has always and will always be an integral part of the software product life cycle. However, there are more solutions to be discovered that will result from the coming together of the worlds of Cyber Security and DevOps.

The Future

The transition into “DevSecOps” will open the door for a more dynamic and secure way of managing infrastructure and automated deployment. As we work towards maximizing risk management and prevention, flexibility, speed, time to market, AND security will be equally prioritized.

CA Release Automation with DBmaestro: Step-by-step Guide

CA Release Automation is an enterprise-class, continuous delivery solution that automates complex, multi-tier release deployments through orchestration and promotion of applications from development through production.

CA Release Automation speeds up application release cycles and improves business and operational agility. It reduces errors and achieves higher-quality releases by simplifying and standardizing application release processes. Finally, it reduces the cost of application deployments and promotes collaboration and alignment between Development and Operations.


DBmaestro together with CA Release Automation ensures your database is included in your DevOps and continuous delivery processes. Covering everything from development best practices to database impact analysis based deployment automation – your database can now be handled with the same standards you use for your application code.

 

Using Your CA Release Automation Tool

We’re going to show you how to use your CA Release Automation tool to best benefit your company using DBmaestro DevOps Platform.

Once the action pack is properly installed and loaded, you can assign actions to your release processes.

Under the designer in the process design, you can choose the components tab to start adding actions available from DBmaestro TeamWork.


If you choose the “Add an Action” command, you can see the available commands by expanding the DBmaestro TeamWork folder. The actions are grouped into “simple” and “advanced” actions, a distinction that gives DBmaestro DevOps Platform users additional flexibility and control over their automation and design process.


When an action is selected, input fields are displayed to provide information to the action being performed. The input depends on the selected action and references information from the DBmaestro TeamWork Pipeline Builder.


The DBmaestro Pipeline Builder is a visual resource that enables you to package, verify, deploy, and promote database changes just as you would with your application code, allowing you to build and visualize a full delivery pipeline. It defines the mappings relevant to your environments.

A Unique Automation Deployment Plan Per Environment

Each pipeline project represents a unique automation deployment plan for a predefined set of environments and the relationships between those environments, enabling you to design the automation flow mapping once and reuse it in later build and deploy processes.
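Conceptually, a pipeline project is an ordered chain of environments with promotion relationships between them. The minimal Python sketch below illustrates that idea only; the class, environment names, and methods are hypothetical stand-ins, not DBmaestro’s actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Pipeline:
    """Illustrative model of a pipeline project: an ordered chain of environments."""
    name: str
    environments: list = field(default_factory=list)

    def add_environment(self, env: str) -> None:
        # Each environment is promoted to from the one added before it.
        self.environments.append(env)

    def next_environment(self, env: str):
        """Return the environment a change is promoted to after `env`, or None."""
        idx = self.environments.index(env)
        return self.environments[idx + 1] if idx + 1 < len(self.environments) else None

# Design the flow mapping once, then reuse it for every build and deploy process.
pipeline = Pipeline("orders-db")
for env in ("Dev", "QA", "Staging", "Production"):
    pipeline.add_environment(env)

print(pipeline.next_environment("QA"))  # Staging
```

The one-time design pays off because every later build or deploy step can ask the pipeline where a change goes next instead of hard-coding environment names.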


The Build Latest Version action asks for details such as the pipeline name, the environment name, and a new label name. Continue adding the actions needed to define the deployment processes for the database changes.


Change Validation

Here is an example of a flow to validate and deploy the database changes:

The flow starts by building the latest version of the database changes. It then runs a validation to confirm there is no configuration drift in the target, and finally deploys the changes to the target environment.
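The three steps above can be sketched in plain Python. This is an illustrative outline of the flow, not DBmaestro’s or CA’s real API; all function names, parameters, and the label value are hypothetical:

```python
# Illustrative sketch of the validate-and-deploy flow described above.
# Function names and parameters are hypothetical stand-ins for the real actions.

def build_latest_version(pipeline: str, environment: str, label: str) -> dict:
    """Package the latest database changes under a new label."""
    return {"pipeline": pipeline, "environment": environment, "label": label}

def has_configuration_drift(package: dict, target: str) -> bool:
    """Compare the target schema against its expected baseline."""
    # A real run would inspect the target database; here we assume no drift.
    return False

def deploy(package: dict, target: str) -> str:
    """Apply the packaged changes to the target environment."""
    return f"deployed {package['label']} to {target}"

# Flow: build, validate, then deploy only if the target is drift-free.
package = build_latest_version("orders-db", "Dev", "v1.42")
target = "QA"
if has_configuration_drift(package, target):
    raise RuntimeError("Configuration drift detected; aborting deployment.")
result = deploy(package, target)
print(result)  # deployed v1.42 to QA
```

The key design point is that the drift check gates the deployment: if the target has been changed outside the pipeline, the flow stops instead of applying changes on top of an unexpected schema.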


Circling Back to the Why of the Matter

As demonstrated, CA Release Automation together with DBmaestro helps you automate your database deployments as part of your continuous delivery pipeline.

It helps IT operations, development teams, and application owners speed up application release cycles, improve business and operational agility, reduce errors, and achieve higher-quality releases by simplifying and standardizing application release processes. Finally, it reduces the cost of application deployments and promotes collaboration and alignment between Development and Operations.

In other words, this partnership between DBmaestro and CA allows for DevOps for the database – and throughout the entire application development process.

David Rosi: Don’t Leave the Database Out of the DevOps Toolchain

The DevOps Enterprise Summit in San Francisco is an extremely important conference that provides a measuring stick for the progress the DevOps industry has made over the past year. For our American readers, it’s “The State of the Union” of DevOps.

The best and brightest of the industry come together to talk about trends, successes, and challenges, to learn from one another about the latest in the DevOps toolchain, and to exchange ideas. It is also a place where those considering adopting DevOps come to learn about the resources on offer.

As it happened, the DevOps Enterprise Summit also made for the perfect opportunity for DBmaestro’s newly appointed EVP of North American Operations, David Rosi, to make his first public appearance as a member of the company.


In an interview conducted by Matt Hines from DevOps TV, Rosi talked about the database being left behind, and how DBmaestro is changing perceptions. “A couple of years ago, people were not thinking about the database. They were doing things manually, writing scripts, doing simple compare and sync analysis. They weren’t doing what they now need to do. We need to bring all those pieces together to be successful.”

DevOps Toolchain

Rosi, who leads DBmaestro’s North American operations from the Boston area, addressed the increased attention the database has received at this year’s conference.

“The uptick that we’ve all seen in the use of tools on the application side is putting the database under more of a spotlight, so more and more companies realize they need to address the problem. Clients are starting to engage and show classic behavior in a market that’s starting to ramp up.”

Source Control

Rosi understands the buzzwords associated with the DevOps culture: speed, scale, automation. “But those things on their own are not enough,” he says.


“We looked at source control, for example, that’s different in a database world, and then the ability to automate what you now have trust in, so that you don’t have anybody playing around with your database. Having that security and reporting that’s going to tell you anything and everything you need to know about what happened with the database is the most important thing.”

Database Automation

Rosi agreed that a few years ago, security and automation were words that made people uncomfortable when seen together. But now, by leveraging automation early on, many of the problems that could become security headaches down the road are rooted out from the start.

The full interview can be seen here.

If you enjoyed this post, check out our article on which tools to use for incorporating the database into the DevOps toolchain.