
Department of Defense Strengthens Its Emphasis on Data

TAKE NOTE (Insights and Emerging Technology)

The Defense Department is transforming itself into a data-centric organization, and is tasking its chief data officer with new responsibilities to help it along its way.

In a memo, signed by Deputy Defense Secretary Kathleen Hicks, the Pentagon adopted five “data decrees” to “generate transformative proficiency and efficiency gains across the DoD Data Strategy’s focus areas.”

Hicks wrote that the changes are “critical to improving performance and creating decision advantage at all echelons from the battlespace to the board room, ensuring U.S. competitive advantage. To accelerate the department’s efforts, leaders must ensure all DoD data is visible, accessible, understandable, linked, trustworthy, interoperable, and secure.”

The decrees state that the Pentagon will maximize data sharing and rights for data use, publish data assets in a federated catalog, and make data usable by artificial intelligence and machines. DoD will also store data in a safe manner that is uncoupled from hardware and software, and implement best practices for secure authentication, access management, encryption, and protection of data.

Hicks’ memo directs the creation of a Data Council to coordinate data activities across all DoD components, and calls on those components to appoint data leaders to manage data throughout its lifecycle and promote data literacy.

Under the new guidance, DoD’s chief data officer will be responsible for issuing policy and guidance covering the Pentagon’s data ecosystem, data sharing, data architecture, lifecycle management and a data-ready workforce.

The CDO is also tasked with working with the Joint Staff and Joint Artificial Intelligence Center to make it easier for weapons systems and different components in DoD to talk to each other.

“In order to rapidly field an enterprise data management solution, the Department will seek to scale existing capabilities that have proven themselves in the battlespace and in real-world operations, simulations, experiments, and demonstrations,” the memo states.

The memo solidifies the Advancing Analytics platform (Advana) as the single authoritative enterprise data management and analytics platform for the Pentagon.

Read More

Interested in learning more about RPA? Download our FREE White Paper on “Embracing the Future of Work”

UNDER DEVELOPMENT (Insights for Developers)

Understanding The Different SAP Cloud Offerings

Intro


Cloud computing is one of the hottest buzzwords in technology; a search for the term turns up some 48 million results on the Internet. But amid all the chatter, cloud computing has caused a paradigm shift in the way software vendors deliver their solutions to customers. SAP’s own offerings can be quite confusing: there is SAP HANA Enterprise Cloud, SAP Cloud Platform (previously SAP HANA Cloud Platform), and SAP HANA Cloud.

That’s enough similar-sounding names to confuse anyone. In this blog we will attempt to demystify the SAP trichotomy…

SAP is among the leading cloud-based software vendors, with 41 data centers worldwide and 110 million subscribers. It introduced SAP HANA, the in-memory technology that drives its cloud computing, in 2010. Building on SAP HANA, the company acquired SuccessFactors, Ariba and Concur, three cloud software companies, in the first half of the 2010s to spearhead its expansion into the cloud. SuccessFactors delivers human capital management applications, Ariba handles procurement, and Concur is a travel and expense management solution. SAP S/4HANA Cloud was introduced in 2016, and SAP HANA Cloud Platform was rebranded as SAP Cloud Platform the following year.

SAP continues to encourage its clients to take advantage of its cloud offerings rather than remain on on-premise landscapes. Even so, many are reluctant because they find SAP’s cloud portfolio complicated. The information below attempts to simplify SAP’s cloud-based solutions so that you can benefit from what cloud-based software offers.

The SAP Cloud Service Models

There are three cloud service models:

  • IaaS (infrastructure-as-a-service)
  • PaaS (platform-as-a-service)
  • SaaS (software-as-a-service)

With IaaS, your company outsources the equipment needed to support its operations: hardware, servers, networking components and storage. The IaaS provider remains the owner of the equipment and handles its housing, running and maintenance. In general, IaaS offers virtual machines with storage and availability for which you pay on a per-use basis. It is often the foundation for more extensive cloud service offerings and works best for companies with temporary infrastructure needs or those shifting from capital expenditure to operating expenditure. Think DigitalOcean, Linode, Rackspace, Amazon Web Services (AWS), Cisco Metapod, Microsoft Azure, and Google Compute Engine (GCE).
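For a sense of what IaaS looks like in practice, here is a minimal sketch using the AWS SDK for Python (boto3) to provision a single virtual machine. It is purely illustrative: the region, AMI ID and instance type are placeholders, and it assumes AWS credentials are already configured on your machine.

```python
# Minimal IaaS sketch: provisioning a virtual machine with the AWS SDK for Python (boto3).
# The AMI ID, region and instance type are placeholders for illustration only.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Request a single small virtual machine; with IaaS you pick the machine profile,
# while the provider owns and maintains the physical equipment underneath.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder machine image
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched instance {instance_id}")

# Because IaaS is billed per use, terminate the machine when it is no longer needed.
ec2.terminate_instances(InstanceIds=[instance_id])
```

The point is that you choose the capacity and pay only while it runs; the provider looks after the physical hardware.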

With PaaS, you get an entire platform for software along with related services like connectivity, authentication and persistence. This model offers a development platform that enables clients, partners and independent software vendors to create innovative cloud-based applications quickly. Think AWS Elastic Beanstalk, Windows Azure, Heroku, Force.com, Google App Engine, Apache Stratos, and OpenShift.
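To illustrate the division of labor, here is a sketch of the kind of application code you hand over to a PaaS such as Heroku or Google App Engine, written as a small Flask app (the route and port are just assumptions for the example). You supply the application; the platform supplies the runtime, connectivity and scaling underneath it.

```python
# Minimal PaaS-style application sketch: you write only the application code,
# and the platform provides the runtime, routing and scaling around it.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health")
def health():
    # The platform routes HTTP traffic to this handler; no server setup is required.
    return jsonify(status="ok")

if __name__ == "__main__":
    # Locally you start the development server; on a PaaS the platform launches the app.
    app.run(host="0.0.0.0", port=8080)
```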

SaaS is sometimes known as on-demand software. In this model, the software and its associated data are hosted centrally in the cloud. The applications you need are managed and delivered remotely via a standard web browser and a secure internet connection. Access to the solutions is charged as a subscription at a fixed cost per user, often monthly. SaaS is primarily used for non-differentiating, non-core applications such as procurement, human resources or customer relationship management. Think Google Workspace, Dropbox, Salesforce, Cisco WebEx, Concur, and Zoom.
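From a developer’s perspective, a SaaS product is typically consumed through its web interface or a REST API over HTTPS. The sketch below uses Python’s requests library against a hypothetical SaaS endpoint; the URL, token and field names are placeholders, not any vendor’s real API.

```python
# SaaS consumption sketch: the vendor hosts the software and data centrally,
# and you interact with it over HTTPS. The endpoint, token and fields below
# are hypothetical placeholders, not a real vendor API.
import requests

API_BASE = "https://api.example-saas.com/v1"   # placeholder SaaS endpoint
TOKEN = "YOUR_API_TOKEN"                        # per-user subscription credential

response = requests.get(
    f"{API_BASE}/invoices",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
response.raise_for_status()

for invoice in response.json().get("invoices", []):
    print(invoice["id"], invoice["amount"])
```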

Deployment options for SAP Cloud

The deployment option for your software denotes the approach a vendor uses to provide it in the cloud; the options differ in licensing, software lifecycle management and infrastructure. The following are the common deployment models for cloud-based software:

  • Hybrid cloud: This combines private and public clouds to establish an IT landscape that strikes the best balance between security and convenience.
  • Public cloud: Resources are hosted at the vendor’s premises and shared by multiple clients who access them via the web.
  • Managed cloud: Resources are dedicated to a single client, who accesses them via a VPN; the software vendor operates the infrastructure in its own data center.
  • Private cloud: Resources are dedicated to a single client and run on infrastructure located on that client’s premises, owned and managed either by the client or by a third party.

With an understanding of the service models and deployment options for cloud-based software, here are the options available to your company from SAP. First, we will take a look at SAP HANA Enterprise Cloud. From there we will narrow our scope to SAP Cloud Platform, to understand the Cloud Foundry environment a bit, and finally arrive at SAP HANA Cloud, including any new innovations…

Read More

– Dig Deeper –
SAP Cloud Platform in 10 Minutes

Q&A (Post your questions and get the answers you need)

Q. What is a Hyperscaler? We are looking to move to the cloud and this term is used a lot.

A. No worries. Before we talk about Hyperscalers, let’s briefly define what Hyperscale means first…

The term “Hyperscale” refers to scalable cloud computing systems in which a very large number of servers are networked together. The number of servers used at any one time can increase or decrease to respond to changing requirements. This means the network can efficiently handle both large and small volumes of data traffic.

If a network is described as “scalable”, it can adapt to changing performance requirements. Hyperscale servers are small, simple systems that are designed for a specific purpose. To achieve scalability, the servers are networked together “horizontally”. This means that to increase the computing power of an IT system, additional server capacity is added. Horizontal scaling is also referred to as “scaling out”.

The alternative solution – vertical scaling or scaling up – involves upgrading an existing local system, for example, by using better hardware: more RAM, a faster CPU, more powerful hard drives, etc. In practice, vertical scaling usually comes first. In other words, local systems are upgraded as far as is technically feasible, or as much as the hardware budget permits, at which point the only remaining solution is generally Hyperscale.
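To make the horizontal-versus-vertical distinction concrete, here is a toy Python sketch (purely illustrative, nothing like a real hyperscale data center): requests are spread round-robin across a pool of small servers, and capacity grows simply by adding another server to the pool rather than upgrading an existing machine.

```python
# Toy illustration of horizontal scaling ("scaling out"): capacity grows by
# adding more small servers to the pool instead of upgrading a single machine.
from itertools import cycle

class ServerPool:
    def __init__(self, names):
        self.names = list(names)
        self._rr = cycle(self.names)   # simple round-robin dispatcher

    def scale_out(self, name):
        """Add another server to the pool to handle more traffic."""
        self.names.append(name)
        self._rr = cycle(self.names)

    def handle(self, request):
        server = next(self._rr)
        return f"{server} handled {request}"

pool = ServerPool(["server-1", "server-2"])
print(pool.handle("request-A"))   # dispatched to server-1
print(pool.handle("request-B"))   # dispatched to server-2

pool.scale_out("server-3")        # demand grows, so capacity is added horizontally
print(pool.handle("request-C"))
```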

OK, now to answer your question: what is a Hyperscaler?

A Hyperscaler is the operator of a data center that offers scalable cloud computing services. The first company to enter this market was Amazon in 2006, with Amazon Web Services (AWS). AWS is a subsidiary of Amazon that helps to optimize use of Amazon’s data centers around the world. AWS also offers an extensive range of specific services. It holds a market share of around 40%.

The other two big players in this market are Microsoft, with its Azure service (2010), and Google Cloud Platform (2010). IBM is also a major provider of hyperscale computing solutions. All of these companies also work with approved partners to offer their technical services via local data centers in specific countries. This is an important consideration for many companies, especially since the General Data Protection Regulation (GDPR) came into force.

Cheers!