Have you met an enterprise software auditor? If you’re working in IT in a large organization, the answer is probably yes.
Employed by large enterprise software vendors, these software police reserve the right to sniff around your infrastructure in search of unlicensed instances. And if they find them, a big bill may land at your doorstep.
Martin Thompson, founder of the ITAM Forum, told Computer Weekly last year that auditing is a proven way for vendors to increase revenue.
Customers pay a lot for those software licenses. Typically there is a perpetual license fee plus a software update and support contract, and the latter adds more than one-fifth of the purchase price annually for Oracle licenses. Oracle's list price starts at $47,000 per processor for a bare-metal installation, and the effective price also depends on the type of processor you are using: x86 and IBM Power chips are treated differently under Oracle's pricing scheme.
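To make those figures concrete, here is a back-of-the-envelope sketch of the three-year cost of an Oracle processor license, using the list price and roughly one-fifth annual support rate cited above. The core factors are illustrative assumptions: Oracle publishes an official core factor table that discounts x86 cores relative to some other architectures.

```python
# Rough estimate of Oracle per-processor licensing cost, using the
# article's figures. Core factors are assumptions for illustration:
# Oracle's core factor table typically discounts x86 cores (0.5)
# relative to other processor families.
def license_cost(cores: int, core_factor: float,
                 list_price: float = 47_000.0,
                 support_rate: float = 0.22,
                 years: int = 3) -> float:
    """Perpetual license fee plus annual support over `years` years."""
    processors = cores * core_factor          # licensable processor count
    upfront = processors * list_price         # one-off perpetual license fee
    support = upfront * support_rate * years  # recurring support contract
    return upfront + support

# A 16-core x86 server (core factor 0.5) over three years:
x86_total = license_cost(cores=16, core_factor=0.5)
# The same core count at core factor 1.0 costs exactly twice as much:
other_total = license_cost(cores=16, core_factor=1.0)
print(round(x86_total), round(other_total))
```

Even at the discounted x86 core factor, a single mid-sized server runs well into six figures over three years, which is why the capacity-planning stakes described below are so high.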
These licenses are also inflexible, warns Wendy Neu, head of the Americas database specialist solutions architect team at Amazon Web Services (AWS). The terms are so rigid that customers effectively pay a capacity-planning tax. Those Neu talks to complain about two negative effects. First, they cannot easily scale up, which is a problem in environments with volatile workloads. Second, they cannot scale down, which leaves them with a huge budget for idle capacity.
One alternative is to decamp to the cloud, where, at least technically, you can scale up and down as needed. However, software vendors' licensing terms often make it more difficult to run database instances there.
For example, Oracle's cloud licensing policy counts virtual processors differently from physical ones: when hyper-threading is enabled, it treats two virtual CPUs (vCPUs) as a single licensable processor.
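The counting rule described above can be sketched in a few lines. This is a simplification of the rule as the article states it; the authoritative source is Oracle's own cloud licensing policy document, which covers more cases than this.

```python
import math

# Sketch of the vCPU counting rule described in the text: with
# hyper-threading enabled, two vCPUs count as one processor license;
# without it, each vCPU counts as a full processor. Simplified for
# illustration -- Oracle's policy document is the authoritative source.
def oracle_processor_licenses(vcpus: int, hyperthreading: bool) -> int:
    if hyperthreading:
        return math.ceil(vcpus / 2)  # two vCPUs = one licensable processor
    return vcpus                     # one vCPU = one licensable processor

# An 8-vCPU hyper-threaded instance needs 4 processor licenses;
# the same instance without hyper-threading would need 8.
print(oracle_processor_licenses(8, True), oracle_processor_licenses(8, False))
```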
Its cloud licensing policy does not cover some Oracle software at all. If you need an Oracle Real Application Clusters (RAC) license in a cloud environment, you must look to a dedicated host or a VMware Cloud offering, which lets you tell the vendor exactly how many physical processors you are mapping in the cloud.
For some customers, porting an existing legacy database workload to the cloud can be too problematic. Instead, it may be easier to retire the on-premises license entirely and buy a managed service with the license built in, without having to deal with a software vendor at all.
Clients are increasingly looking for this kind of relationship, according to Neu, who says they are gun-shy after years of punishing fees.
“They don’t want contracts or partnerships, just as you don’t want mobile phone contracts that you can’t get out of,” she says. “They don’t want to pay for things they don’t use, or to need more and not be able to get it without intervention.”
She explains that customers using AWS-managed databases noticed this flexibility during the pandemic. For those with a courier business model, orders surged overnight. Without cloud scaling, their databases would have come apart at the seams.
“Customers who were already in the cloud were able to scale up,” she adds. “They didn’t have to ask anyone, they didn’t have to sign additional contracts, they just had to spin up another instance.”
The flexibility Neu describes goes beyond scaling: she says portability between different data models also improves.
House of Brick, an IT service management company, insists that one of the biggest challenges for companies running Oracle databases in-house is organizational. The rigid nature of many Oracle database implementations creates silos that the teams responsible for maintaining them are reluctant to change, leaving the business unable to evolve as the data platform stagnates.
In one client organization, things got so bad that some business units were running an Oracle version that had fallen out of support ten years earlier. According to the services company, developing a database cloud strategy can make change more manageable, break down organizational silos, and help database teams meet business needs.
Moving to a managed, cloud-based stack creates a more flexible environment that better supports application changes, and it gives developers more options. Companies that want to optimize their data models may need a wider range of technologies than the relational model supports, spanning AI, complex relationship mapping, and even streaming analytics for specialized applications such as industrial sensors. That usually means adopting other data models, such as graph, streaming, and document-based systems, or moving from rigid tables to a key-value store for better scalability.
Different applications may require different data models on the back end. For example, an enterprise may move much of its data to Amazon's proprietary Aurora relational system while shifting other workloads to DynamoDB's key-value store to improve performance.
Running these new data engines on-premises can be a daunting proposition for many companies, especially when multiple models must be combined to support a variety of applications. Modernizing the application architecture toward more modular container-based microservices and serverless applications also requires tight integration with the database engine.
All of this argues for managed databases in the cloud, where many of these integrations are built into the platform, reducing developer effort, learning time, and time to market.
Enterprises can use databases developed directly by cloud service providers. Alternatively, if they already run on-premises versions of various open source databases, they can move to API-compatible versions in the cloud and eliminate direct maintenance and support contracts.
Migrating from older on-premises systems for the licensing and scalability benefits is also a good opportunity to explore new technologies, and it may free you from other architectural constraints. Neu points to the Amazon Aurora relational database, with read replicas for high availability and performance scaling.
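The read-replica pattern Neu mentions works because Aurora-style clusters expose a writer endpoint for the primary instance and a reader endpoint that load-balances across replicas. The sketch below shows the application-side half of that pattern, routing read-only statements to the reader endpoint; the endpoint hostnames and the list of read verbs are illustrative assumptions, not real infrastructure.

```python
# Minimal sketch of read/write splitting against an Aurora-style cluster,
# which exposes one writer endpoint and one reader endpoint fronting the
# read replicas. Hostnames here are hypothetical placeholders.
WRITER_ENDPOINT = "mydb.cluster-example.us-east-1.rds.amazonaws.com"
READER_ENDPOINT = "mydb.cluster-ro-example.us-east-1.rds.amazonaws.com"

# Statements starting with these verbs are treated as read-only.
READ_VERBS = ("SELECT", "SHOW", "EXPLAIN")

def endpoint_for(sql: str) -> str:
    """Route read-only statements to the reader endpoint, all else to the writer."""
    verb = sql.lstrip().split(None, 1)[0].upper()
    return READER_ENDPOINT if verb in READ_VERBS else WRITER_ENDPOINT

print(endpoint_for("SELECT * FROM orders"))        # goes to the reader
print(endpoint_for("UPDATE orders SET status=1"))  # goes to the writer
```

Because the reader endpoint distributes connections across however many replicas exist, scaling read capacity becomes a matter of adding replicas rather than renegotiating licenses.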
This all sounds utopian, but many companies face a real challenge in migrating from a long-standing on-premises database architecture, with business rules layered directly into the system by long-serving DBAs. Migrations are often costly and stressful, which is how incumbent database companies keep people locked into unfavorable on-premises contracts.
It may feel like ripping off a giant bandage, but migration eliminates what House of Brick calls the Oracle “fear of change” factor in on-premises installations. Testing and applying the regular patches vendors issue for database products can change operational behavior, which makes teams anxious. In a managed database environment, these issues are handled for you.
Neu says migration is a one-off cost, and the AWS Database Migration Service helps shoulder the burden. It provides a set of tools for assessing source and target workloads and data, allowing enterprises to decide what to move and what to refactor.
The numbers seem attractive. AWS says it has investigated migration costs and mapped them to benefits, including not only license savings but also the elimination of tedious, inefficient day-to-day operations. Customers it spoke to reported a 264% return on investment in the three years after migrating their database deployments to RDS, and a 39% reduction in operating costs over the same period.
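It is worth being clear what a 264% three-year ROI means: net benefits equal to 2.64 times the migration cost. The sketch below spells out that arithmetic; the dollar amounts are hypothetical, chosen only to illustrate the quoted percentage.

```python
# What a 264% three-year ROI means in plain arithmetic: net benefits
# (benefits minus cost) equal to 2.64x the migration cost.
# Dollar amounts below are hypothetical illustrations.
def roi(total_benefit: float, cost: float) -> float:
    """Return on investment as a percentage: (benefit - cost) / cost * 100."""
    return (total_benefit - cost) / cost * 100

migration_cost = 500_000.0
# To hit the quoted 264% figure, three-year benefits would need to reach
# cost * (1 + 2.64) = 1,820,000 on a 500,000 migration:
required_benefit = migration_cost * (1 + 2.64)
print(round(roi(required_benefit, migration_cost), 6))  # 264.0
```

In other words, the claim is that every dollar spent on migration returned roughly $3.64 in combined savings and benefits over three years.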
Enterprises should not migrate away from their on-premises licensing structure without assessing migration costs, application roadmaps, the projected operating costs of cloud architectures, and the problems they currently experience with software licensing. However, initial assessments often indicate that the option at least deserves closer attention.
In addition to the Database Migration Service tools, AWS offers programs including assessments, incentives, and end-to-end migration support from its professionals to help enterprise customers move critical workloads to its platform. As the company puts it, this should “really eliminate the risk of migration.”