What You Need to Know to Migrate Your Oracle Database to the Cloud

By | Oracle
Many organizations are making the switch from on-prem to cloud databases. These cloud-native databases can provide more reliable and timely access, cost efficiency, and automated backup and recovery. If you are using any on-premises version of Oracle Database, you might want to consider whether moving to the Oracle Database Cloud is right for your company. There are numerous migration methods, and the process can be very simple when you work with an expert. Reviewing these key factors before selecting a migration method can save you time.

Know Your On-Premises Operating System/Platform and Version – Make sure you fully understand the platform your company is currently running on, along with its version. Are you using Windows, Mac OS, or Linux? Some operating requirements will apply depending on the migration method you adopt.

Know Your On-Premises Database Version & Cloud Database Version – Are you currently using Oracle Database 11g, 12c CDB, or 12c Non-CDB? Depending on your answer and the cloud version you are moving to, Oracle offers multiple migration options. For example, a move from on-premises Oracle Database 11g to Oracle Database (more…)
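The idea that the right migration method depends on the source/target pairing can be sketched as a simple lookup. This is an illustrative sketch only: the method names are real Oracle tools (Data Pump, RMAN, GoldenGate, PDB unplug/plug), but the mapping here is a simplified example, not official Oracle migration guidance.

```python
# Hypothetical mapping from (source version, target architecture) to
# candidate migration methods. Simplified for illustration only.
CANDIDATE_METHODS = {
    ("11g", "Non-CDB"): ["Data Pump export/import", "RMAN backup/restore"],
    ("11g", "CDB"): ["Data Pump export/import", "GoldenGate replication"],
    ("12c Non-CDB", "CDB"): ["Data Pump export/import", "Non-CDB to PDB adoption"],
    ("12c CDB", "CDB"): ["PDB unplug/plug", "PDB remote cloning"],
}

def candidate_methods(source_version: str, target_architecture: str) -> list:
    """Return candidate migration methods for a source/target pairing,
    falling back to Data Pump, which works across most combinations."""
    return CANDIDATE_METHODS.get(
        (source_version, target_architecture), ["Data Pump export/import"]
    )

print(candidate_methods("11g", "CDB"))
```

In practice an expert would also weigh downtime tolerance, database size, and network bandwidth before picking a method.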

Improving your Database Security with Microsoft Azure

By | Azure
If you are interested in migrating to a cloud database but concerned about security issues, look no further. Hybrid and cloud-native databases do have their own set of security concerns, but if you go with a trusted company and maintain them appropriately, you have nothing to fear. In fact, cloud databases can be more secure, since they have access to the most advanced security protocols and automatic updates. Cloud-native databases, or DBaaS (Database as a Service) platforms, are databases hosted and maintained by third-party providers. Operating a DBaaS means that your company will access data through the internet instead of through your internal network. DBaaS solutions offer great benefits for organizations in today’s “Big Data” arena. DBaaS is more scalable and flexible than traditional VM databases, and it eliminates the hardware requirements of maintaining a database system. It is crucial to understand and trust the security protocols of your cloud provider before making the switch. Microsoft Azure is at the top of the list when it comes to cloud database offerings, and it is a particularly good choice for the security-conscious client. Microsoft in general, and Azure in particular, have a long history of prioritizing security for its (more…)

5 Tips for Managing Multiple Cloud Providers

By | Azure, Database Technology Trends, Oracle, SAP HANA, SQL Server, Sybase
Cloud adoption continues to rise and shows no signs of stopping anytime soon. As companies become more comfortable with the cloud as a tool, they are moving farther from on-prem setups. The newest trend shows many groups now using multi-cloud solutions. RightScale's 2018 State of the Cloud Report found that 81% of respondents have a multi-cloud environment. The organizational average was just under 5 clouds, including production and test environments.

Why Multi-Cloud? – Each cloud provider has different strengths and weaknesses. Using multiple clouds lets you take advantage of the best tools and applications for your workloads without limiting your resources. It can also be strategic: companies can shift workloads among their platforms depending on current prices and availability. Adopting a multi-cloud environment can take your workload management to the next level, but it also comes with challenges to prepare for. Here are 5 tips to help you manage multiple cloud providers.

Choose Wisely – Don’t be suckered in by low rates and storage fees. Those numbers can (and will) change. The more important factor is to select the clouds that have the key features you need. Look at the applications and (more…)

Data Science: Making the Most of your Big Data

By | Azure, Database Technology Trends, Oracle, SQL Server, Sybase
As today’s trends bring more and more data to our fingertips, the next challenge is figuring out what to do with it all. The insights gained from proper data analysis have major implications for business. Proper data analysis takes careful consideration, appropriate infrastructure, and an understanding of data science.

What is Data Science? – This is the process of using data to draw conclusions or predict outcomes. You probably use data science already. For example, a restaurant owner may keep records of which dishes are ordered. Over time, he may notice that chili is ordered more frequently on rainy days than on sunny days. This may encourage him to prepare larger batches of chili when the weather calls for rain. This is a very simple example of data science, but you can see how understanding patterns and trends in data can help prepare your business. Data science can contribute to the following business improvements:

Streamline Processes – You can use data science to analyze your business processes and identify areas to streamline. Looking at your website usage statistics may show where potential clients are losing focus, or identify slow and frustrating pages in your online ordering forms. Identifying these patterns can (more…)
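The restaurant example above boils down to counting outcomes under different conditions. A minimal sketch, with made-up order data, shows how little code that analysis takes:

```python
from collections import Counter

# Hypothetical order log: (dish, weather) pairs a restaurant owner might track.
orders = [
    ("chili", "rainy"), ("chili", "rainy"), ("salad", "sunny"),
    ("chili", "sunny"), ("salad", "rainy"), ("chili", "rainy"),
]

# Count how often each dish is ordered under each weather condition.
counts = Counter(orders)

# Compare chili orders on rainy vs sunny days.
print("rainy-day chili:", counts[("chili", "rainy")])  # 3
print("sunny-day chili:", counts[("chili", "sunny")])  # 1
```

Real data science applies the same idea at scale: collect observations, group them by condition, and let the differences guide decisions.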

Simplifying Your Analytics with Azure Databricks

By | Azure
Big data and analytics go hand in hand. If you are handling big data, then you understand both the power and the pitfalls it brings. Advanced analytics options, AI, and real-time processing are useful tools, but to balance this power you may end up sacrificing productivity and ease of use. Azure Databricks provides a high-performance analytics engine optimized for your Azure cloud database. It’s incredibly easy to deploy, even for large-scale enterprises, freeing up your team for more important work.

What is Databricks? – Azure Databricks is a collaboration between Microsoft and Databricks to provide an optimized analytics platform for a large-scale cloud environment. So, what does it do? First, let’s look at Databricks. Basically, Databricks is Apache Spark made more accessible to a large-scale enterprise environment. If you aren’t familiar with Apache Spark, it is the open-source analytics engine developed at UC Berkeley that has gained significant traction in the past 5 years. It has become a leading platform with widespread use at the enterprise level. It has some drawbacks, though: it is complicated to deploy, and the larger your organization, the more complicated it becomes. Enter Databricks. This managed version automates a lot of the processes, making deployment (more…)

3 Reasons to Consider Oracle EPM Cloud

By | Oracle
As a business leader, it’s up to you to make some big decisions about your data and processes. It can be tricky to figure out which platform is right for your company. There are no universal answers; you have to find what’s right for you. If your current system has room for improvement, you may want to consider Oracle’s EPM Cloud. This comprehensive data solution streamlines your business processes, providing innovation, flexibility, and state-of-the-art security.

Innovation – In a recent survey conducted by Oracle, 81% of finance leaders reported that the top-rated benefit of using Oracle’s EPM Cloud was consistent access to innovative technology. With Oracle’s EPM Cloud you are always at the forefront of new technology. Most groundbreaking tech is being released on the cloud, and it pays to take advantage of the platform. Cutting-edge analytics, like machine learning and AI, can analyze your data and detect patterns that human eyes would never catch. With this advanced analysis you can make more strategic decisions. Always up-to-date technology also means more options for automation, which saves you time and money and reduces the likelihood of errors. That newfound time can be better spent on strategy than on rote tasks.

Flexibility (more…)

Preparing for the General Data Protection Regulation (GDPR)

By | Security
The General Data Protection Regulation (GDPR) becomes fully effective tomorrow. The time is finally upon us. If that makes you nervous, relax: we’ve got your back. I’m sure you’ve been thinking a lot about these regulations recently, doing whatever you can to prepare. So have we. Our team is very familiar with the legislation and has been working tirelessly to ensure that all our services are in compliance. All third parties have been thoroughly vetted, and our security experts are prepared to review your situation. Here is a quick overview of the important points of the GDPR and how our services fit in.

General Data Protection Regulation – The GDPR is a new set of regulations by the European Union (EU) that comes into effect on May 25th. Personal data is spreading at an unprecedented rate today. With the GDPR, the EU plans to get in front of the information sprawl to protect personal privacy. The new legislation unifies the practices and penalties of member states. This means both businesses and individuals know what to expect regarding privacy laws no matter where they are interacting. All EU citizens will be under its protection. If you are not actively doing (more…)

Raju Chidambaram To Speak At Tampa Bay’s PoweredUP Festival

By | Press Releases
Dobler Consulting’s CTO Joins Distinguished Panel to Explore The Science of Data

Tampa Bay, FL, May 21st, 2018 — One of the factors driving the growth and recognition of Tampa Bay as an emergent technology hub is the annual PoweredUP Festival. This year Dobler Consulting is proud not only to be a Sponsor of the event but also to have its CTO, Raju Chidambaram, serve as a member of the Science of Data Panel, sharing his thoughts and insights on current and future trends. Joining Raju on the panel will be data scientists from Nielsen and AgileThought.

“I am really looking forward to this panel’s deep dive into the science of data,” said Raju. “We are in the initial stages of one of the world’s great migrations – the mass migration of data into the cloud. The assumptions we make today about the science of data, the design of data, and the security of data will affect humankind in more profound ways than any other migration in human history. It is simply that important.”

The Festival, held on May 23rd at the Mahaffey Theater, annually draws over 1,000 attendees. This year, in addition to the Science of Data (more…)

Peter Dobler of Dobler Consulting to Take Over Role as President of TBTLA

By | Company News, General Announcements
The start of 2018 is accompanied by exciting news for Peter Dobler, Founder & CEO of Dobler Consulting. With Dobler Consulting listed as one of the Top 50 Fastest Growing Companies by the Tampa Bay Business Journal and a Finalist for the Small Business of the Year awards, it comes as no surprise that Peter Dobler has been appointed as the next President of the TBTLA.

The TBTLA, or Tampa Bay Technology Leadership Association, is a local non-profit organization whose membership is limited to current and former technology executives. Its two-fold mission is to apply the collective experience of the membership to real-world technology problems, and to mentor students – and other professionals – who are anticipating careers in information technology[1]. The most important way the TBTLA achieves this mission is through its successful GETSMART program. Established in 2001, the TBTLA is one of the many reasons why Tampa Bay is becoming a booming technology hub for businesses of all sizes.

The TBTLA partnered with the University of Tampa’s Information & Technology Management department to create GETSMART. GETSMART, short for Getting Everyone to Study Math and Related Technologies, is an educational and mentoring (more…)

In-Memory Databases vs Intel NVMe NUMA

By | Blog, Database Technology Trends, SAP HANA, Sybase
For the past few years, the hot topic for database vendors has been the in-memory database. The general belief was, and maybe still is, that server memory is getting cheaper and density will continuously improve. For the most part, this has been the case, though perhaps not as dramatically as predicted. However, the data storage needs of databases increased at a much faster pace over the same period, which created another problem: going beyond 1TB of memory in a single server quickly becomes very expensive. A 1TB database used to be called very large; that label now applies to 5-10 TB databases.

In-memory databases are needed where read and write performance are equally important. Traditional databases improve read performance by caching data in memory, but caching alone does nothing for write performance. To be ACID (Atomic, Consistent, Isolated, and Durable) compliant, data must be written to persistent storage. An in-memory database achieves this by writing every transaction in parallel to high-performance storage, in many cases a flash drive on the PCI bus or an SSD. But every write must be persisted outside of memory (RAM) to be ACID compliant. There are many (more…)
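The durability idea above can be illustrated with a toy key-value store: reads are served from RAM, but every write is flushed to an append-only log on disk before it is acknowledged, so committed writes survive a restart. This is a minimal sketch of the pattern, not how any particular in-memory database product is implemented.

```python
import json
import os
import tempfile

class TinyInMemoryStore:
    """Toy sketch: data lives in RAM for fast reads, but every write is
    appended to a log on persistent storage first (the D in ACID)."""

    def __init__(self, log_path):
        self.log_path = log_path
        self.data = {}
        # Replay the log on startup to rebuild the in-memory state.
        if os.path.exists(log_path):
            with open(log_path) as f:
                for line in f:
                    key, value = json.loads(line)
                    self.data[key] = value

    def put(self, key, value):
        # Persist to the log and force it to disk before acknowledging.
        with open(self.log_path, "a") as f:
            f.write(json.dumps([key, value]) + "\n")
            f.flush()
            os.fsync(f.fileno())
        self.data[key] = value  # then update RAM

    def get(self, key):
        return self.data.get(key)  # reads never touch the disk

# Demo: a write survives a simulated restart because the log is replayed.
log_path = os.path.join(tempfile.mkdtemp(), "wal.log")
store = TinyInMemoryStore(log_path)
store.put("user:1", "alice")
restarted = TinyInMemoryStore(log_path)  # rebuild state from the log
print(restarted.get("user:1"))  # alice
```

Production systems add transaction batching, log compaction, and checkpointing on top of this basic write-ahead pattern, which is exactly why the speed of the persistent device (PCI flash, SSD) matters so much for in-memory write throughput.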