The Defense Department’s massive winner-take-all $10 billion cloud contract is dubbed the Joint Enterprise Defense Infrastructure (or JEDI for short).
Star Wars references aside, this contract is huge, even by government standards. The Pentagon would like a single cloud vendor to build out its enterprise cloud, believing, rightly or wrongly, that this is the best approach to maintain focus and control of its cloud strategy.
Nikhilesh De / CoinDesk:
Cloudflare debuts IPFS Gateway, an easy way to access content from the P2P-driven InterPlanetary File System that does not require installing special software
Internet security provider Cloudflare is introducing a new product to help users more easily access the InterPlanetary File System (IPFS), the decentralized storage protocol developed by Protocol Labs.
In a blog post Monday, Cloudflare announced it was launching a “Crypto Week,” where it will announce “support for a new technology that uses cryptography to make the internet better” every day. The first of these technologies is a portal to more easily access IPFS, as well as build websites on top of the technology.
In a second post, the company explains that the peer-to-peer nature of IPFS provides a number of redundancies for users trying to access a specific website or piece of data.
The second feature revolves around the fact that users can request data using hash values, rather than IP addresses.
To ensure users can access data stored through IPFS, Cloudflare is offering a gateway which delivers content using Hypertext Transfer Protocol Secure (HTTPS).
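The content-addressing idea above can be sketched in a few lines of Python. This is a simplified illustration of the concept only: real IPFS identifiers are multihash-encoded CIDs, not raw SHA-256 hex digests.

```python
import hashlib

def content_address(data: bytes) -> str:
    # Content addressing: the identifier is derived from the data itself,
    # so identical content always yields the same address no matter which
    # peer serves it.
    return hashlib.sha256(data).hexdigest()

addr = content_address(b"hello ipfs")

# The requester can verify the bytes it receives by re-hashing them and
# comparing against the address, so it need not trust any particular server.
received = b"hello ipfs"
assert content_address(received) == addr
```

A gateway such as Cloudflare's performs the peer-to-peer retrieval and this verification on the user's behalf, then serves the result over ordinary HTTPS.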
This change in the data center landscape is ultimately a result of the mass emergence of the cloud in 2016.
Consolidation, consolidation, consolidation. Not just in terms of organizational consolidation, but also in terms of data centers. Several reports have started to use the phrase “hyperscale” to describe the new outsourced data centers used by cloud providers. Gartner predicts that “by 2025, 80% of enterprises will have shut down their traditional data center,” versus only 10% today.(1) What does this mean?
At its Ignite conference this week, Microsoft announced improved security features for Azure with the addition of Microsoft Authenticator, Azure Firewall, and several other tools to the cloud computing platform.
After announcing Azure Active Directory (AD) Password Protection in June to combat bad passwords, Microsoft is now bringing password-less logins to Azure AD connected apps with the addition of support for Microsoft Authenticator.
Ujjwal Pugalia / Amazon Web Services:
AWS announces that YubiKey can now be used as a multi-factor authentication (MFA) device to sign into the AWS management console — AWS Identity and Access Management (IAM) best practice is to require all IAM and root users in your account to sign into the AWS Management Console with multi-factor authentication (MFA).
Mike Wheatley / SiliconANGLE:
AWS rolls out new EC2 high memory instances for in-memory databases like SAP HANA, with 6, 9, and 12 TB memory, plans instances with 18 TB and 24 TB for 2019
Amazon Web Services Inc. today offered some revved-up cloud computing instances aimed at customers running resource-intensive “in-memory” databases such as SAP SE’s HANA.
The new High Memory EC2 instances offer a choice of 6, 9 and 12 terabytes of memory, with 18TB and 24TB options to arrive next year.
AWS Chief Evangelist Jeff Barr said in a blog post that the idea is to allow customers to run in-memory databases, which keep entire datasets in short-term RAM memory, closer to the enterprise applications they interact with.
The Pentagon didn’t mince words in March when introducing the Joint Enterprise Defense Initiative, or JEDI Cloud program: “This program is truly about increasing the lethality of our department and providing the best resources to our men and women in uniform,” the Defense Department’s chief management officer, John H. Gibson II, told industry leaders and academics at a public event. JEDI aims to bring DOD’s computing systems into the 21st century by moving them into the cloud. The Pentagon says that will help it inject artificial intelligence into its data analysis and equip soldiers with real-time data during missions, among other benefits. For the winning bidder, the contract will be lucrative: Bloomberg Government estimates it will be worth $10 billion over the next decade.
Or at least that’s how it should have worked.
But the story of the JEDI contract reads more like a bizarre corporate thriller than a simple cloud-computing deal.
Case in point: Late Monday evening, Google announced it no longer plans to submit a bid by Friday, the deadline for interested applicants.
It’s a reasonable assessment, given that developing the JEDI program would conflict with Google’s recently codified stance against developing or deploying artificial intelligence technologies that “cause or are likely to cause overall harm,” as the digital backbone of the Pentagon will certainly do in one way or another. However, it was also a bit of a gimme, as Google probably wouldn’t have won the contract anyway.
That’s because Amazon has been considered the clear front-runner for the JEDI Cloud deal since the details of the contract were first released.
The Pentagon’s 1,375-page winner-take-all request for proposal for JEDI is a web of restrictions and requirements that some critics allege leaves few viable candidates beyond Amazon.
The Pentagon announced the Joint Enterprise Defense Infrastructure (JEDI) project earlier this year, prompting a feeding frenzy among big tech companies eager to get the lucrative contract. Appetites have curbed a bit amid accusations the JEDI contract wasn’t ever up for grabs.
Project JEDI is a $10 billion defense contract for a single commercial provider to build a cloud computing platform to support weapons systems and classified data storage.
The U.S. Department of Defense faces a critically important challenge as it solicits bids for JEDI, the Joint Enterprise Defense Infrastructure. JEDI is intended to modernize and consolidate the defense department’s IT systems into an enterprise-level commercial cloud.
The importance of this transition cannot be overstated: JEDI will be the foundation for integrating advanced technologies such as artificial intelligence and augmented reality into America’s warfighting capability.
Unfortunately, JEDI, as outlined in the final solicitation, would not provide the strongest possible foundation for the 21st century battlefield.
For that reason, IBM today filed a protest of the JEDI solicitation with the U.S. Government Accountability Office.
“IBM knows what it takes to build a world-class cloud. No business in the world would build a cloud the way JEDI would and then lock in to it for a decade. JEDI turns its back on the preferences of Congress and the administration, is a bad use of taxpayer dollars and was written with just one company in mind. America’s warfighters deserve better.”
Khari Johnson / VentureBeat:
GitHub debuts Actions for devs to automate workflows and build, share, and execute code inside containers, security alerts for Java and .NET projects, and more
The GitHub code repository, which has been used by 31 million developers around the world in the past year, today announced a sweeping series of changes, including Actions, a new way for developers to automate workflows and build, share, and execute code inside containers on GitHub.
In a phone interview with VentureBeat, GitHub head of platform Sam Lambert called Actions the “biggest thing we’ve done since the pull request” and compared it to Shortcuts for automating workflows introduced for iOS 12 last month.
Actions will be made available in limited public beta for Developer, Team, and Business Cloud plans on GitHub. They’re designed to make it possible for any team to adopt the best workflows.
“A lot of the major clouds have built products for sysadmins and not really for developers, and we want to hand power and flexibility back to the developer,” Lambert said.
A number of security measures were also made available today, including the Security Advisory API for access to all vulnerabilities found by GitHub for integration into your existing tools and services.
Security vulnerability alerts for Java and .NET code were introduced today to deliver automated notifications and insights into how to fix issues with your code. Proactive security alerts were first introduced last year for Ruby, JavaScript, and Python.
Also new: token scanning for public repositories. This feature allows GitHub to alert a developer or even their cloud provider if secret keys or passwords are, for example, accidentally pushed into a public channel.
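Token scanning of this kind is, at its core, pattern matching over pushed content. Below is a minimal, hypothetical sketch: the `AKIA` prefix plus 16 uppercase alphanumerics is the well-known AWS access key ID format, but GitHub's actual scanner covers many provider-specific formats and coordinates with the provider to validate and revoke matches.

```python
import re

# AWS access key IDs start with "AKIA" followed by 16 uppercase
# alphanumeric characters; other providers have their own formats.
AWS_KEY_RE = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def find_leaked_keys(text: str) -> list:
    """Return any strings in `text` that look like AWS access key IDs."""
    return AWS_KEY_RE.findall(text)

# AKIAIOSFODNN7EXAMPLE is Amazon's documented example (non-functional) key.
diff = 'aws_access_key_id = "AKIAIOSFODNN7EXAMPLE"'
print(find_leaked_keys(diff))  # → ['AKIAIOSFODNN7EXAMPLE']
```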
Fujitsu may have given up the ghost on its own K5 cloud, but it is promising to throw a ton of human resources at selling and managing Microsoft Azure – its public cloud service of choice.
As revealed yesterday, Fujitsu decided to kill the platform in all regions outside of Japan with immediate effect: K5 was sold and deployed as public, private, private virtual or private hosted services. Sales were clearly off-target with no recovery in sight.
A spokesman at the company told us it is selling multiple clouds from vendors including VMware, Oracle, SAP and Microsoft Azure – though it has yet to start flogging Azure Stack.
To put Fujitsu’s efforts into some kind of context, AWS earlier this year said at its Public Sector Summit that it planned to train 100,000 heads this year to keep expanding its tendrils.
The cloud services arm of Amazon said that in Europe alone some 350,000 techies are required, “and we need to help fill those job skills – and quickly”, said Teresa Carlson, veep of worldwide public sector.
Fujitsu said it already uses Azure to host some digital services in “areas as diverse as digital ticketing, legacy modernisation and AI”.
Amazon CEO Jeff Bezos has defended his company’s bids for US military contracts.
“This is a great country and it does need to be defended,” he said at a conference on Monday.
His remarks are an implicit rebuke of Google, which has recently dropped out of the running for the $10 billion JEDI defense contract, saying that it could conflict with its corporate values.
An anonymous reader quotes a report from Bloomberg:
Although the century-old technology has disappeared from most people’s daily view, magnetic tape lives on as the preferred medium for safely archiving critical cloud data in case, say, a software bug deletes thousands of Gmail messages, or a natural disaster wipes out some hard drives. The world’s electronic financial, health, and scientific records, collected on state-of-the-art cloud servers belonging to Amazon.com, Microsoft, Google, and others, are also typically recorded on tape around the same time they are created. Usually the companies keep one copy of each tape on-site, in a massive vault, and send a second copy to somebody like Iron Mountain. Unfortunately for the big tech companies, the number of tape manufacturers has shrunk over the past three years from six to just two — Sony and Fujifilm — and each seems to think that’s still one too many.
Monica Nickelsburg / GeekWire:
Addressing employee concerns, Microsoft explains why it won’t pull out of Pentagon’s $10B JEDI contract bidding and will sell AI and other technologies to DOD — Microsoft will not pull out of the competition for a $10 billion cloud contract for the Department of Defense …
Emil Protalinski / VentureBeat:
Google updates Firebase with premium enterprise-grade support, better ML Kit facial recognition, Management API, a Test Lab for iOS, and more
Google today updated Firebase, its service for helping developers build apps for Android, iOS, and the web. Firebase has gained paid enterprise-grade support, ML Kit Face Contours, a Firebase Management API, Test Lab for iOS, Performance Monitoring improvements, and Firebase Predictions.
The announcements were made at the third annual Firebase Summit in Prague, where Google also noted that over 1.5 million apps are actively using Firebase every month. All of today’s updates are aimed at helping enterprise developers build better apps, improve app quality, and grow their app business.
The average business has around 14 improperly configured IaaS instances running at any given time, and roughly one in every 20 AWS S3 buckets is left wide open to the public internet.
Let’s start with a basic premise that the vast majority of the world’s workloads remain in private data centers. Cloud infrastructure vendors are working hard to shift those workloads, but technology always moves a lot slower than we think. That is the lens through which many cloud companies operate.
The idea that you operate both on prem and in the cloud with multiple vendors is the whole idea behind the notion of the hybrid cloud. It’s where companies like Microsoft, IBM, Dell and Oracle are placing their bets.
Cloud-native computing developed in part to provide a single management fabric across on prem and cloud, freeing IT from having two sets of tools and trying somehow to bridge the gap between the two worlds.
Red Hat — you know, that company that was sold to IBM for $34 billion this week — has operated in this world.
As an example, it has built OpenShift, its version of Kubernetes. As CEO Jim Whitehurst told me last year: “Our hottest product is OpenShift. People talk about containers and they forget it’s a feature of Linux.” That is an operating system that Red Hat knows a thing or two about.
With Red Hat in the fold, IBM can compete in this world by being open source: customers can build modern applications on top of open source tools and run them on IBM’s cloud or any of its competitors’ clouds, a real hybrid approach.
Microsoft has a huge advantage here, of course, because it has a massive presence in the enterprise already.
Oracle brings similar value with its core database products. Companies using Oracle databases (and that is just about everyone) are a natural audience for its hybrid pitch.
Dell, which spent $67 billion for EMC, making the Red Hat purchase pale by comparison, has been trying to pull together a hybrid solution by combining VMware, Pivotal and Dell/EMC hardware.
You could argue that hybrid is a temporary state, that at some point, the vast majority of workloads will eventually be running in the cloud and the hybrid business as we know it today will continually shrink over time. We are certainly seeing cloud infrastructure revenue skyrocketing with no signs of slowing down as more workloads move to the cloud.
But you can see the trend is definitely up:
AWS reported revenue of $6.7 billion for the quarter, up from $4.58 billion the previous year.
Microsoft Intelligent Cloud, which incorporates things like Azure and server products and enterprise services, was at $8.6 billion, up from $6.9 billion.
IBM Technology Services and Cloud Platforms, which includes infrastructure services, technical support services and integration software reported revenue of $8.6 billion, up from $8.5 billion the previous year.
Others like Oracle and Google didn’t break out their cloud revenue.
Azure is now worth betting on: according to Forbes, this Microsoft business is now one of the largest clouds in the world. Azure’s current users include Adobe, Heineken, and HP. According to Microsoft, 90% of Fortune 500 companies also use Azure.
If Azure doesn’t convince you, perhaps Amazon Web Services suits your needs better.
Paul Alcorn / Tom’s Hardware:
AWS announces new AMD EPYC-powered cloud instances, purportedly 10% less expensive than other instance types, available today — We’re here at AMD’s New Horizon Event to bring you up to the minute news on the company’s 7nm products.
AMD is expected to make major announcements about its new 7nm CPUs and GPUs. Intel continues to struggle with its 10nm manufacturing process, which is delayed until late 2019. If AMD can field 7nm processors early this year, it will mark the first time in the company’s history that it has had a process node leadership position over Intel. That should equate to faster, denser, and less power-hungry processors than Intel’s 14nm chips.
AMD hopes the Zen 2 processors will keep it ahead of or at parity with Intel, the world’s biggest maker of PC processors. The earlier Zen designs enabled chips that could process 52 percent more instructions per clock cycle than the previous generation.
Zen has spawned AMD’s most competitive chips in a decade, including Ryzen for the desktop, Threadripper (with up to 32 cores) for gamers, Ryzen Mobile for laptops, and Epyc for servers. Expect to see Zen 2 cores in future models of those chip families.
AMD’s focus is on making central processing units (CPUs), graphics processing units (GPUs), and accelerated processing units (APUs) that put the two other units together on the same chip.
“Zen 2 is our next-generation system architecture,” AMD CEO Lisa Su said, noting that chips using it will be made with 7-nanometer manufacturing.
Su said the new chips will be targeted for the workloads of the future, including machine learning, big data analytics, cloud, and other tasks. AMD is going after the $29 billion total available market for data center chips by 2021.
Bloomberg:
US GAO dismisses Oracle’s argument that the Pentagon’s $10B cloud contract violated federal standards and unfairly favored Amazon — GAO decision deals setback to Oracle’s federal contracts push; rejection frees Pentagon to pursue single-source solution
Oracle Corp. lost a challenge to the Pentagon’s $10 billion cloud contract, as the Government Accountability Office dismissed its argument that the winner-take-all contest violates federal procurement standards and unfairly favors Amazon.com Inc.
The GAO decision issued Wednesday deals a blow to Oracle’s push to expand its federal defense contracts, leaving the tech company with fewer options to improve its chances of winning the award. It also frees the Pentagon to pursue the single-source solution it has opted for all along.
“The Defense Department’s decision to pursue a single-award approach to obtain these cloud services is consistent with applicable statutes (and regulations) because the agency reasonably determined that a single-award approach is in the government’s best interests for various reasons, including national security concerns, as the statute allows,” Ralph White, the GAO’s managing associate general counsel for procurement law, said in a statement.
Organizations Need the Right Technologies and Talent in Place to Ensure a Secure Transition to the Cloud
In my previous column, I wrote about the evolution in security from hardware and point products, to an approach that increasingly relies on security DevOps. However, there is another transition that is also well underway – the shift to the cloud. The RightScale 2018 State of the Cloud Report finds that 96 percent of respondents use cloud, with public cloud adoption increasing to 92 percent from 89 percent in 2017.
I bet if you asked each of the 997 survey respondents to describe their use of the cloud you’d get 997 different answers. That’s because the move to the cloud comes in many different forms, each with its own set of implications for security teams. Here are just a few:
SaaS offerings: Services like Office 365, Google, Box, Dropbox and Salesforce are some of the most common services organizations rely on that are accessed through the cloud.
Employee cloud usage: Employees are using cloud services without ever involving IT. In the case of Shadow IT, these may be legitimate tools to help them get their jobs done. Other times they are using services simply for entertainment.
SecOps in the cloud: According to Gartner, by 2019 more than 30% of the 100 largest vendors’ new software investments will have moved to cloud-only, and this includes investments in security technologies. If you are moving secOps to the cloud, there are many ramifications. Can the service address your bandwidth and oversight requirements?
Corporate services in the cloud: Many organizations are taking advantage of the cloud to respond to business opportunities and challenges with agility – adding new services as needed and rapidly expanding capacity during periods of peak demand. If your organization is among this group, there are some important questions to ask: What infrastructure, apps, and data are moving to the public cloud and when? Will shifting to the cloud introduce gaps in our defenses and, if so, what security precautions can we take?
Below are a few recommendations:
• Consider a Cloud Access Security Broker (CASB) which simplifies access management at scale. When a user leaves the organization or changes roles, access can be updated automatically across all cloud services through a single, easy to read pane.
• With more employees connecting to cloud apps directly through the internet, a Secure Internet Gateway offers visibility into internet activity across all locations, devices, and users, and blocks threats before they ever reach your network or endpoints.
• Firewall cloud solutions can protect cloud workloads as they expand, contract or shift location.
• IT and security professionals with a deep understanding of cloud can be hard to find. Even with various certifications, there’s no substitute for specific knowledge of the actual service.
• Your team has tremendous technical and institutional knowledge that you don’t want to lose, but they may not have other skills needed to support the transition to the cloud, such as knowledge of JSON and Python. Offer training to fill those gaps.
• While in-house staff comes up to speed, look for additional bench strength in the form of outsourced talent that can fill the skills gap and provide advisory and implementation services.
• Break the cycle of Shadow IT. As part of good security governance, architectural groups and committees should meet on a regular basis and include all key stakeholders from business, IT and security.
As you shift to the cloud, remember that this is a journey and that no two journeys are alike.
Amazon announced this week that a new feature designed to prevent data leaks has been added to Amazon Web Services (AWS).
Improperly configured Simple Storage Service (S3) buckets can expose an organization’s sensitive files, as demonstrated by several incidents involving companies such as Viacom, Verizon, Accenture, Booz Allen Hamilton, and Dow Jones.
As a result of numerous incidents, AWS last year introduced a new feature that alerts users of publicly accessible buckets, but researchers have still found data leaks resulting from misconfigured buckets.
Amazon S3 Block Public Access aims to address this by providing settings for blocking existing public access and ensuring that public access is not granted to new items.
“If an AWS account is used to host a data lake or another business application, blocking public access will serve as an account-level guard against accidental public exposure. Our goal is to make clear that public access is to be used for web hosting!” said Jeff Barr, Chief Evangelist for AWS.
The new settings can be accessed from the S3 console, the command-line interface (CLI) or the S3 APIs, and they allow users to manage public ACLs and public bucket policies.
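As a sketch, the four Block Public Access settings form a small configuration object. The flag names below are S3's documented options; the boto3 call in the comment is an assumed usage pattern and would require real AWS credentials and an account ID to run.

```python
# Account-level Block Public Access configuration (all four flags on).
public_access_block = {
    "BlockPublicAcls": True,        # reject requests that attach public ACLs
    "IgnorePublicAcls": True,       # treat any existing public ACLs as private
    "BlockPublicPolicy": True,      # reject new public bucket policies
    "RestrictPublicBuckets": True,  # restrict access to buckets with public policies
}

# Assumed application via boto3 (needs credentials; "123456789012" is a
# placeholder account ID):
#   import boto3
#   boto3.client("s3control").put_public_access_block(
#       AccountId="123456789012",
#       PublicAccessBlockConfiguration=public_access_block,
#   )
print(all(public_access_block.values()))  # → True
```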
It was a stormy Monday for cloud stocks today with the general trend pointing way down.
Wall Street was apparently disenchanted with just about every cloud company, large and small and in-between. Nobody, it seemed, was spared investors’ wrath:
Box was down 6.93 percent to $16.66
Workday was down 7.57 percent to $124.07
Twilio was down 13.76 percent to $76.90
Amazon (which of course includes AWS) was down 5.09 percent to $1,512.29
CNBC:
A look back at Diane Greene’s tenure as Google cloud boss: a struggle to catch up with AWS and Azure, and tension with Sundar Pichai on GitHub and Project Maven
Google poured resources into its cloud unit during Diane Greene’s three-year run at the helm, but the company has still struggled against Amazon and Microsoft.
Greene clashed with Google CEO Sundar Pichai over a big cloud contract with the Department of Defense.
Thomas Kurian, a former Oracle executive, has been named as Greene’s successor.
Tom Krazit / GeekWire:
AWS introduces its own custom-designed Arm server processor, AWS Graviton Processor, claims 45% lower costs for some workloads — After years of waiting for someone to design an Arm server processor that could work at scale on the cloud, Amazon Web Services just went ahead and designed its own.
GeekWire is reporting this week from Amazon’s signature cloud technology conference in Las Vegas, as the public cloud giant announces new products, partnerships and technology initiatives.
Earlier this year I told you about the AWS Nitro System and promised you that it would allow us to “deliver new instance types more quickly than ever in the months to come.” Since I made that promise we have launched memory-intensive R5 and R5d instances, high frequency z1d instances, burstable T3 instances, high memory instances with up to 12 TiB of memory, and AMD-powered M5a and R5a instances. The purpose-built hardware and the lightweight hypervisor that comprise the AWS Nitro System allow us to innovate more quickly while devoting virtually all of the power of the host hardware to the instances.
We acquired Annapurna Labs in 2015 after working with them on the first version of the AWS Nitro System. Since then we’ve worked with them to build and release two generations of ASICs (chips, not shoes) that now offload all EC2 system functions to Nitro, allowing 100% of the hardware to be devoted to customer instances. A few years ago the team started to think about building an Amazon-built custom CPU designed for cost-sensitive scale-out workloads.
Asha McLean / ZDNet:
Amazon announces AWS Global Accelerator to boost performance across regions and AWS Transit Gateway, a tool to simplify network architecture and reduce overhead
The AWS Global Accelerator is expected to boost performance of global workloads and the AWS Transit Gateway is aimed at simplifying network architecture.
Peter DeSantis, VP of global infrastructure at AWS, has touted the AWS Global Accelerator as improving availability and performance for AWS customers’ end users.
Essentially, user traffic enters AWS Global Accelerator through the closest edge location. The accelerator then routes the user traffic to the closest healthy application endpoint within the global AWS network. Lastly, at the endpoint, the application response returns over the AWS global network and reaches the user through the optimal endpoint.
Users are directed to an AWS customers’ workload based on their geographic location, application health, and weights that the AWS customer can configure.
The new service also allocates static Anycast IP addresses.
In AWS Global Accelerator, customers are charged for each accelerator that is deployed and the amount of traffic in the dominant direction that flows through the accelerator. The company expects customers will typically set up one accelerator for each application, but more complex applications may require more than one.
Users will be charged a fixed hourly fee of $0.025 for every accelerator that is running, on top of standard Data Transfer rates.
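At that rate, the fixed portion of the bill is easy to estimate. The sketch below counts only the hourly accelerator fee; data transfer is billed on top at standard rates, and 730 is the usual hours-per-month approximation.

```python
HOURLY_FEE = 0.025  # USD per running accelerator, per the announcement

def monthly_fixed_fee(accelerators: int, hours: int = 730) -> float:
    # Fixed fee only; Data Transfer charges are billed separately.
    return round(accelerators * hours * HOURLY_FEE, 2)

print(monthly_fixed_fee(1))  # → 18.25 USD per accelerator per month
```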
Mary Jo Foley / ZDNet:
Microsoft details three independent root causes behind last week’s worldwide MFA outage that affected Azure, Office 365, Dynamics, and other Microsoft users
Microsoft has posted a root cause analysis of the multifactor authentication issue which hit a number of its customers worldwide last week. Here’s what happened.
Microsoft’s Azure team has gone public with the root cause it discovered when investigating the November 19 worldwide multi-factor-authentication outage that plagued a number of its customers.
For 14 hours on November 19, Microsoft’s Azure Active Directory Multi-Factor Authentication (MFA) services were down for many. Because Office 365 and Dynamics users authenticate via this service, they also were affected.
The first root cause showed up as a latency issue in the MFA front-end’s communication to its cache services. The second was a race condition in processing responses from the MFA back-end server. These two causes were introduced in a code update rollout which began in some datacenters on Tuesday November 13 and completed in all datacenters by Friday November 16, Microsoft officials said.
A third identified root cause, which was triggered by the second, resulted in the MFA back-end being unable to process any further requests from the front-end, even though it seemed to be working fine based on Microsoft’s monitoring.
Brian Heater / TechCrunch:
Amazon debuts RoboMaker, a cloud-based service to let coders develop, test, and deploy their robotics applications using the open source Robot Operating System
Amazon’s kicking off Re:Invent week with the launch of AWS RoboMaker. The cloud-based service utilizes the widely deployed open-source software Robot Operating System (ROS) to offer developers a place to develop and test robotics applications.
RoboMaker essentially serves as a platform to help speed up the time-consuming robotics development process. Among the tools offered by the service are Amazon’s machine learning technologies and analytics that help create a simulation for real-world robotics development.
Yes, everyone shopped their wallets dry on Amazon during the big holiday sales push. But don’t forget that the AWS public cloud side of Amazon is likely to drive valuation and operating income going forward.
Amazon Web Services (AWS) provides numerous benefits to customers, allowing companies to be more responsive, available, and cost-efficient. It also provides a number of security capabilities, including strong identity and access management, granular activity logs, and strong policy enforcement.
However, that doesn’t mean you shouldn’t worry about security in your AWS environment. Simply put, AWS provides enough flexibility for you to shoot yourself in the foot if you aren’t careful. Gartner estimates that through 2022, at least 95 percent of cloud security failures will be the customer’s fault. Of course, AWS invented the now-famous shared responsibility model to educate customers on these risks and their role in protecting their workloads.
Tonight at AWS re:Invent, the company announced a new tool called AWS Transit Gateway designed to help build a network topology inside of AWS that lets you share resources across accounts and bring together on premises and cloud resources in a single network topology.
Amazon already has a popular product called Amazon Virtual Private Cloud (VPC), which helps customers build private instances of their applications. The Transit Gateway is designed to help build connections between VPCs, which, up until now, has been tricky to do.
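The “tricky” part is largely combinatorial: VPC peering is point-to-point, so full connectivity needs a connection per pair of VPCs, while a transit gateway is a hub with one attachment per VPC. A quick back-of-the-envelope sketch (illustrative arithmetic, not an AWS API):

```python
def peering_links(n_vpcs: int) -> int:
    """Full-mesh VPC peering: one connection per pair of VPCs."""
    return n_vpcs * (n_vpcs - 1) // 2

def transit_gateway_links(n_vpcs: int) -> int:
    """Hub-and-spoke via a transit gateway: one attachment per VPC."""
    return n_vpcs

# With 10 VPCs: 45 peering connections to manage vs. 10 gateway attachments.
```

The gap widens quadratically, which is why pairwise peering becomes unmanageable well before an organization runs out of accounts.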
During Andy Jassy’s keynote, Amazon announced a service called AWS Ground Station, which allows you to talk to satellites on a pay-as-you-go basis.
Created almost twenty years ago, the CubeSat standard has lowered the barrier to entry, and so far more than 700 CubeSats have made their way to orbit.
All those satellites in orbit mean there needs to be a corresponding build-out of ground station capability. Yesterday’s announcement was made in partnership with Lockheed Martin, and the service will be part of their “Verge” ground station network.
Google this week announced the general availability of secure LDAP, after introducing the capability in October at Next ’18 London.
Allowing customers to manage access to traditional LDAP-based apps and IT infrastructure, it can be used with either G Suite or Cloud Identity, Google’s managed identity and access management (IAM) platform.
Secure LDAP, the Internet search giant explains, supports management of access to both software-as-a-service (SaaS) apps and traditional LDAP-based apps/infrastructure, regardless of whether on-premises or in the cloud, via a single IAM platform.
Secure LDAP enables authentication, authorization, and user/group lookups and, because the same user directory is used for both SaaS and LDAP apps, logging into services like G Suite and other SaaS apps is similar to that for traditional applications.
Amazon Web Services on Wednesday announced the launch of AWS Security Hub, a service designed to aggregate and prioritize alerts from AWS and third-party security tools.
Unveiled at the AWS re:Invent 2018 conference, AWS Security Hub provides organizations a comprehensive view of their security status by consuming, aggregating, organizing and prioritizing data from Amazon GuardDuty, Amazon Inspector, Amazon Macie, and tools from AWS partners.
A significant number of cybersecurity firms announced on Wednesday that their products can be integrated with the AWS Security Hub, including CrowdStrike, Twistlock, Tenable, Armor, McAfee, Splunk, Check Point, Palo Alto Networks, Alert Logic, Qualys, Sophos, Trend Micro, Sumo Logic and Fortinet. Each of these companies issued statements, press releases and blog posts regarding the partnership with AWS.
Amazon.com Inc. will let customers put servers used in the company’s cloud-computing data centers into their own facilities, an effort to reach businesses that want to store some of their technology functions in the cloud while keeping tighter control of others.
The move announced Wednesday by Amazon Web Services Chief Executive Officer Andy Jassy helps provide hybrid-cloud strategies desired by many larger enterprise customers in an area where cloud-competitor Microsoft Corp. is making headway.
Tomi Engdahl says:
http://www.etn.fi/index.php/13-news/8436-azure-on-selvasti-kaytetyin-pilvialusta-suomessa
Tomi Engdahl says:
Why the Pentagon’s $10 billion JEDI deal has cloud companies going nuts
https://techcrunch.com/2018/09/15/why-the-pentagons-10-billion-jedi-deal-has-cloud-companies-going-nuts/?utm_source=tcfbpage&sr_share=facebook
Defense Department’s massive winner-take-all $10 billion cloud contract dubbed the Joint Enterprise Defense Infrastructure (or JEDI for short).
Star Wars references aside, this contract is huge, even by government standards. The Pentagon would like a single cloud vendor to build out its enterprise cloud, believing, rightly or wrongly, that this is the best approach to maintain focus and control of its cloud strategy.
Tomi Engdahl says:
Nikhilesh De / CoinDesk:
Cloudflare debuts IPFS Gateway, an easy way to access content from the P2P-driven InterPlanetary File System that does not require installing special software — Internet security provider Cloudflare is introducing a new product to help users more easily access the InterPlanetary File System …
Cloudflare Launches Decentralized Web Gateway at Its First ‘Crypto Week’
https://www.coindesk.com/cloudflare-launches-decentralized-web-gateway-at-its-first-crypto-week/
Internet security provider Cloudflare is introducing a new product to help users more easily access the InterPlanetary File System (IPFS), the decentralized storage protocol developed by Protocol Labs.
In a blog post Monday, Cloudflare announced it was launching a “Crypto Week,” where it will announce “support for a new technology that uses cryptography to make the internet better” every day. The first of these technologies is a portal to more easily access IPFS, as well as build websites on top of the technology.
In a second post, the company explains that the peer-to-peer nature of IPFS provides a number of redundancies for users trying to access a specific website or piece of data.
The second feature revolves around the fact that users can request data using hash values rather than IP addresses.
To ensure users can access data stored through IPFS, Cloudflare is offering a gateway which delivers content using Hypertext Transfer Protocol Secure (HTTPS).
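Requesting data by hash is content addressing: the identifier is derived from the data itself, so any node holding the bytes can serve them and the fetcher can verify integrity without trusting the server. A toy Python sketch of the principle; real IPFS wraps the digest in a self-describing multihash/CID rather than a bare SHA-256 hex string:

```python
import hashlib

def content_address(data: bytes) -> str:
    """Toy content address: the SHA-256 digest of the bytes themselves."""
    return hashlib.sha256(data).hexdigest()

store = {}  # stands in for the peer-to-peer network

def put(data: bytes) -> str:
    addr = content_address(data)
    store[addr] = data  # any node holding the same bytes serves the same address
    return addr

def get(addr: str) -> bytes:
    data = store[addr]
    # The fetcher can verify integrity without trusting whoever served it:
    if content_address(data) != addr:
        raise ValueError("content does not match its address")
    return data
```

This is what makes the redundancy work: identical content always resolves to the same address, no matter which peer, or which gateway, delivers it.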
Tomi Engdahl says:
http://www.etn.fi/index.php/13-news/8439-pelottaako-kuvien-varmuuskopiointi-verkkoon-harkitse-omaa-pilvea
Tomi Engdahl says:
Cloud adoption drives hyperscale data centers
https://www.csemag.com/single-article/cloud-adoption-drives-hyperscale-data-centers/17c19944fc8dc6b013d6b5f2823d03c2.html?OCVALIDATE=
This change in the data center landscape is ultimately a result of the mass emergence of the cloud in 2016.
Consolidation, consolidation, consolidation. Not just in terms of organizational consolidation, but also in terms of data centers. Several reports started to use the phrase “hyperscale” to describe the new outsourced data centers used by cloud providers. Gartner predicts that “by 2025, 80% of enterprises will have shut down their traditional data center,” versus only 10% today. (1) What does this mean?
Tomi Engdahl says:
Microsoft Boosts Azure Security With Array of New Tools
https://www.securityweek.com/microsoft-boosts-azure-security-array-new-tools
At its Ignite conference this week, Microsoft announced improved security features for Azure with the addition of Microsoft Authenticator, Azure Firewall, and several other tools to the cloud computing platform.
After announcing Azure Active Directory (AD) Password Protection in June to combat bad passwords, Microsoft is now bringing password-less logins to Azure AD connected apps with the addition of support for Microsoft Authenticator.
Tomi Engdahl says:
Ujjwal Pugalia / Amazon Web Services:
AWS announces that YubiKey can now be used as a multi-factor authentication (MFA) device to sign into the AWS management console — AWS Identity and Access Management (IAM) best practice is to require all IAM and root users in your account to sign into the AWS Management Console with multi-factor authentication (MFA).
Use YubiKey security key to sign into AWS Management Console with YubiKey for multi-factor authentication
https://aws.amazon.com/blogs/security/use-yubikey-security-key-sign-into-aws-management-console/
Tomi Engdahl says:
Mike Wheatley / SiliconANGLE:
AWS rolls out new EC2 high memory instances for in-memory databases like SAP HANA, with 6, 9, and 12 TB memory, plans instances with 18 TB and 24 TB for 2019 — Amazon Web Services Inc. today offered some revved-up cloud computing instances aimed at customers running resource-intensive “in-memory” databases such as SAP SE’s HANA.
Amazon rolls out High Memory Instances for in-memory databases
https://siliconangle.com/2018/09/27/amazon-rolls-high-memory-instances-memory-databases/
Amazon Web Services Inc. today offered some revved-up cloud computing instances aimed at customers running resource-intensive “in-memory” databases such as SAP SE’s HANA.
The new High Memory EC2 instances offer a choice of 6, 9 and 12 terabytes of memory, with 18TB and 24TB options to arrive next year.
AWS Chief Evangelist Jeff Barr said in a blog post that the idea is to allow customers to run in-memory databases, which keep entire datasets in short-term RAM memory, closer to the enterprise applications they interact with.
Now Available – Amazon EC2 High Memory Instances with 6, 9, and 12 TB of Memory, Perfect for SAP HANA
https://aws.amazon.com/blogs/aws/now-available-amazon-ec2-high-memory-instances-with-6-9-and-12-tb-of-memory-perfect-for-sap-hana/
Tomi Engdahl says:
How the Pentagon’s Move to the Cloud Landed in the Mud
https://www.wired.com/story/how-pentagons-move-to-cloud-landed-in-mud/
The Pentagon didn’t mince words in March when introducing the Joint Enterprise Defense Initiative, or JEDI Cloud program: “This program is truly about increasing the lethality of our department and providing the best resources to our men and women in uniform,” the Defense Department’s chief management officer, John H. Gibson II, told industry leaders and academics at a public event. JEDI aims to bring DOD’s computing systems into the 21st century by moving them into the cloud. The Pentagon says that will help it inject artificial intelligence into its data analysis and equip soldiers with real-time data during missions, among other benefits. For the winning bidder, the contract will be lucrative: Bloomberg Government estimates it will be worth $10 billion over the next decade.
Or at least that’s how it should have worked.
The story of the JEDI contract reads more like a bizarre corporate thriller than a simple cloud-computing deal.
Case in point: Late Monday evening, Google announced it no longer plans to submit a bid by Friday, the deadline for interested applicants.
It’s a reasonable assessment, given that developing the JEDI program would conflict with Google’s recently codified stance against developing or deploying artificial intelligence technologies that “cause or are likely to cause overall harm,” as the digital backbone of the Pentagon will certainly do in one way or another. However, it was also a bit of a gimme, as Google probably wouldn’t have won the contract anyway.
That’s because Amazon has been considered the clear front-runner for the JEDI Cloud deal since the details of the contract were first released.
The Pentagon’s 1,375-page winner-take-all request for proposal for JEDI is a web of restrictions and requirements that some critics allege leaves few viable candidates beyond Amazon.
Tomi Engdahl says:
https://www.tivi.fi/Kaikki_uutiset/amazon-saamassa-10-miljardin-jattidiilin-kilpailijat-haistavat-palaneen-karya-huonoa-kayttoa-verodollareille-6744732
Amazon likely to win $10B Pentagon contract – Google, IBM, and Oracle aren’t happy
https://thenextweb.com/google/2018/10/11/amazon-likely-to-win-10b-pentagon-contract-google-ibm-and-oracle-arent-happy/
The Pentagon announced the Joint Enterprise Defense Infrastructure (JEDI) project earlier this year, prompting a feeding frenzy among big tech companies eager to get the lucrative contract. Appetites have curbed a bit amid accusations the JEDI contract wasn’t ever up for grabs.
Project JEDI is a $10 billion defense contract for a single commercial provider to build a cloud computing platform to support weapons systems and classified data storage.
Tomi Engdahl says:
JEDI: Why We’re Protesting
https://www.ibm.com/blogs/policy/jedi-protest/
The U.S. Department of Defense faces a critically important challenge as it solicits bids for JEDI, the Joint Enterprise Defense Infrastructure. JEDI is intended to modernize and consolidate the defense department’s IT systems into an enterprise-level commercial cloud.
The importance of this transition cannot be overstated: JEDI will be the foundation for integrating advanced technologies such as artificial intelligence and augmented reality into America’s warfighting capability.
Unfortunately, JEDI, as outlined in the final solicitation, would not provide the strongest possible foundation for the 21st century battlefield.
For that reason, IBM today filed a protest of the JEDI solicitation with the U.S. Government Accountability Office.
– IBM knows what it takes to build a world-class cloud. No business in the world would build a cloud the way JEDI would and then lock in to it for a decade. JEDI turns its back on the preferences of Congress and the administration, is a bad use of taxpayer dollars and was written with just one company in mind. America’s warfighters deserve better. –
Tomi Engdahl says:
Khari Johnson / VentureBeat:
GitHub debuts Actions for devs to automate workflows and build, share, and execute code inside containers, security alerts for Java and .NET projects, and more
GitHub launches Actions to execute code in containers and security alerts for Java and .NET projects
https://venturebeat.com/2018/10/16/github-launches-actions-to-execute-code-in-containers-and-security-alerts-for-java-and-net-projects/
The GitHub code repository, which has been used by 31 million developers around the world in the past year, today announced a sweeping series of changes, including Actions, a new way for developers to automate workflows and build, share, and execute code inside containers on GitHub.
In a phone interview with VentureBeat, GitHub head of platform Sam Lambert called Actions the “biggest thing we’ve done since the pull request” and compared it to Shortcuts for automating workflows introduced for iOS 12 last month.
Actions will be made available in limited public beta for Developer, Team, and Business Cloud plans on GitHub. They’re designed to make it possible for any team to adopt the best workflows.
“A lot of the major clouds have built products for sysadmins and not really for developers, and we want to hand power and flexibility back to the developer.”
A number of security measures were also made available today, including the Security Advisory API for access to all vulnerabilities found by GitHub for integration into your existing tools and services.
Security vulnerability alerts for Java and .NET code were introduced today to deliver automated notifications and insights into how to fix issues with your code. Proactive security alerts were first introduced last year for Ruby, JavaScript, and Python.
Also new: token scanning for public repositories. This feature allows GitHub to alert a developer or even their cloud provider if secret keys or passwords are, for example, accidentally pushed into a public channel.
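Scanners of this kind work by matching well-known key formats. An illustrative Python sketch using the documented AWS access key ID shape (AKIA followed by 16 uppercase alphanumerics); GitHub’s actual ruleset covers many providers and is considerably more sophisticated:

```python
import re

# AWS access key IDs are 20 characters starting with "AKIA"; patterns for
# other secret types vary by provider. This is an illustrative subset,
# not GitHub's actual ruleset.
AWS_KEY_ID = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def scan_for_secrets(text: str) -> list:
    """Return substrings of `text` that look like AWS access key IDs."""
    return AWS_KEY_ID.findall(text)
```

Running a check like this on every push is cheap, and alerting the cloud provider directly (so the key can be revoked) closes the window faster than waiting on the developer.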
Tomi Engdahl says:
https://lamia.fi/blog/google-cloud-platform-tuli-suomeen?utm_source=Facebook&utm_medium=Post%20&utm_campaign=GCP
Tomi Engdahl says:
Battling a multi-cloud deployment? Get our expert IT advice in this handy catch-up video
There are many mistakes you can make – learn how to avoid them from gurus
https://www.theregister.co.uk/2018/10/17/multicloud_strategy_webcast/
Tomi Engdahl says:
Fujitsu: We love Microsoft Azure, we’re training 10,000 bods on it
Er, one-tenth of the human capital AWS is throwing at its public cloud
https://www.theregister.co.uk/2018/10/16/fujitsu_microsoft_azure_training_staff/
Fujitsu may have given up the ghost on its own K5 cloud, but it is promising to throw a ton of human resources at selling and managing Microsoft Azure – its public cloud service of choice.
As revealed yesterday, Fujitsu decided to kill the platform in all regions outside of Japan with immediate effect: K5 was sold and deployed as public, private, private virtual or private hosted services. Sales were clearly off-target with no recovery in sight.
A spokesman at the company told us it is selling multiple clouds from vendors including VMware, Oracle, SAP and Microsoft Azure – though it has yet to start flogging Azure Stack.
To put Fujitsu’s efforts into some kind of context, AWS earlier this year said at its Public Sector Summit that it planned to train 100,000 heads this year to keep expanding its tendrils.
The cloud services arm of Amazon said that in Europe alone some 350,000 techies are required, “and we need to help fill those job skills – and quickly”, said Teresa Carlson, veep of worldwide public sector.
Fujitsu said it already uses Azure to host some digital services in “areas as diverse as digital ticketing, legacy modernisation and AI”.
Tomi Engdahl says:
Jeff Bezos attacks Google for refusing US military contracts: ‘This is a great country and it does need to be defended’
https://nordic.businessinsider.com/jeff-bezos-attacks-google-defends-bids-us-military-contracts-2018-10?utm_source=facebook&utm_medium=referral&utm_content=topbar&utm_term=desktop&referrer=facebook&r=US&IR=T
Amazon CEO Jeff Bezos has defended his company’s bids for US military contracts.
“This is a great country and it does need to be defended,” he said at a conference on Monday.
His remarks are an implicit rebuke of Google, which has recently dropped out of the running for the $10 billion JEDI defense contract, saying that it could conflict with its corporate values.
Tomi Engdahl says:
The Future of the Cloud Depends On Magnetic Tape
https://hardware.slashdot.org/story/18/10/17/2314229/the-future-of-the-cloud-depends-on-magnetic-tape?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29
An anonymous reader quotes a report from Bloomberg:
Although the century-old technology has disappeared from most people’s daily view, magnetic tape lives on as the preferred medium for safely archiving critical cloud data in case, say, a software bug deletes thousands of Gmail messages, or a natural disaster wipes out some hard drives. The world’s electronic financial, health, and scientific records, collected on state-of-the-art cloud servers belonging to Amazon.com, Microsoft, Google, and others, are also typically recorded on tape around the same time they are created. Usually the companies keep one copy of each tape on-site, in a massive vault, and send a second copy to somebody like Iron Mountain. Unfortunately for the big tech companies, the number of tape manufacturers has shrunk over the past three years from six to just two — Sony and Fujifilm — and each seems to think that’s still one too many.
Tomi Engdahl says:
Monica Nickelsburg / GeekWire:
Addressing employee concerns, Microsoft explains why it won’t pull out of Pentagon’s $10B JEDI contract bidding and will sell AI and other technologies to DOD — Microsoft will not pull out of the competition for a $10 billion cloud contract for the Department of Defense …
Microsoft defends bid for $10B Pentagon cloud contract amid criticism over government use of technology
https://www.geekwire.com/2018/microsoft-defends-bid-10b-dod-cloud-contract-amid-criticism-government-use-technology/
Tomi Engdahl says:
How to install FreeBSD 11 on Google Cloud Compute
https://www.cyberciti.biz/faq/howto-deploying-freebsd11-unix-on-google-cloud/
Tomi Engdahl says:
Emil Protalinski / VentureBeat:
Google updates Firebase with premium enterprise-grade support, better ML Kit facial recognition, Management API, a Test Lab for iOS, and more
Google updates Firebase with enterprise-grade support, ML Kit Face Contours, Management API, and more
https://venturebeat.com/2018/10/29/google-updates-firebase-with-enterprise-grade-support-ml-kit-face-contours-management-api-and-more/
Google today updated Firebase, its service for helping developers build apps for Android, iOS, and the web. Firebase has gained paid enterprise-grade support, ML Kit Face Contours, a Firebase Management API, Test Lab for iOS, Performance Monitoring improvements, and Firebase Predictions.
The announcements were made at the third annual Firebase Summit in Prague, where Google also noted that over 1.5 million apps are actively using Firebase every month. All of today’s updates are aimed at helping enterprise developers build better apps, improve app quality, and grow their app business.
Tomi Engdahl says:
McAfee says cloud security not as bad as we feared… it’s much worse
Quick takeaway: most everyone sucks at IaaS
https://www.theregister.co.uk/2018/10/30/mcafee_cloud_security_terrible/
The average business has around 14 improperly configured IaaS instances running at any given time, and roughly one in every 20 AWS S3 buckets is left wide open to the public internet.
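“Wide open” here typically means an ACL grant to S3’s predefined AllUsers group. A hypothetical checker over the Grants structure that S3’s GetBucketAcl call returns, written as a pure function so it needs no AWS credentials to run:

```python
# Predefined S3 group URI that represents "anyone on the internet".
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

def is_public(grants: list) -> bool:
    """True if any ACL grant targets the global AllUsers group.
    `grants` mirrors the Grants list returned by S3's GetBucketAcl."""
    for grant in grants:
        grantee = grant.get("Grantee", {})
        if grantee.get("Type") == "Group" and grantee.get("URI") == ALL_USERS:
            return True
    return False

open_bucket = [{"Grantee": {"Type": "Group", "URI": ALL_USERS},
                "Permission": "READ"}]
private_bucket = [{"Grantee": {"Type": "CanonicalUser", "ID": "abc123"},
                   "Permission": "FULL_CONTROL"}]
```

In practice you would feed this the ACLs of every bucket in every account on a schedule, which is precisely the kind of misconfiguration audit the report suggests most businesses are not doing.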
Tomi Engdahl says:
https://www.tivi.fi/CIO/yritysten-herkka-data-pysyy-pilvessa-turvassa-nama-ovat-3-tarkeinta-syyta-6746873
Tomi Engdahl says:
The hybrid cloud market just got a heck of a lot more compelling
https://techcrunch.com/2018/10/30/the-hybrid-cloud-market-just-got-a-heck-of-a-lot-more-compelling/?sr_share=facebook&utm_source=tcfbpage
Let’s start with a basic premise that the vast majority of the world’s workloads remain in private data centers. Cloud infrastructure vendors are working hard to shift those workloads, but technology always moves a lot slower than we think. That is the lens through which many cloud companies operate.
The idea that you operate both on prem and in the cloud with multiple vendors is the whole idea behind the notion of the hybrid cloud. It’s where companies like Microsoft, IBM, Dell and Oracle are placing their bets.
Cloud-native computing developed in part to provide a single management fabric across on prem and cloud, freeing IT from having two sets of tools and trying somehow to bridge the gap between the two worlds.
Red Hat — you know, that company that was sold to IBM for $34 billion this week — has operated in this world.
As an example, it has built OpenShift, its version of Kubernetes. As CEO Jim Whitehurst told me last year, “Our hottest product is OpenShift. People talk about containers and they forget it’s a feature of Linux,” he said. That is an operating system that Red Hat knows a thing or two about.
With Red Hat in the fold, IBM can contend that, being open source, customers can build modern applications on top of open source tools and run them on IBM’s cloud or on any of its competitors’ clouds, a genuinely hybrid approach.
Microsoft has a huge advantage here, of course, because it has a massive presence in the enterprise already.
Oracle brings similar value with its core database products; companies using Oracle databases include just about everyone.
Dell, which spent $67 billion for EMC, making the Red Hat purchase pale by comparison, has been trying to pull together a hybrid solution by combining VMware, Pivotal and Dell/EMC hardware.
You could argue that hybrid is a temporary state, that at some point, the vast majority of workloads will eventually be running in the cloud and the hybrid business as we know it today will continually shrink over time. We are certainly seeing cloud infrastructure revenue skyrocketing with no signs of slowing down as more workloads move to the cloud.
But you can see the trend is definitely up:
AWS reported revenue of $6.7 billion for the quarter, up from $4.58 billion the previous year.
Microsoft Intelligent Cloud, which incorporates things like Azure and server products and enterprise services, was at $8.6 billion, up from $6.9 billion.
IBM Technology Services and Cloud Platforms, which includes infrastructure services, technical support services, and integration software, reported revenue of $8.6 billion, up from $8.5 billion the previous year.
Others like Oracle and Google didn’t break out their cloud revenue.
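The raw figures above translate into very different growth rates, which is the real story behind the revenue comparison:

```python
def yoy_growth(current: float, prior: float) -> float:
    """Year-over-year growth, as a percentage."""
    return (current / prior - 1) * 100

# Quarterly cloud revenue in $ billions, from the figures cited above:
aws = yoy_growth(6.7, 4.58)        # roughly 46%
microsoft = yoy_growth(8.6, 6.9)   # roughly 25%
ibm = yoy_growth(8.6, 8.5)         # roughly 1%
```

By this arithmetic AWS grew roughly 46% year over year, Microsoft’s Intelligent Cloud about 25%, and IBM’s segment barely 1%, though the segments bundle different businesses and are not directly comparable.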
Tomi Engdahl says:
Using AWS F1 FPGA Acceleration
https://www.hackster.io/adam-taylor/using-aws-f1-fpga-acceleration-d5563b
How to create an AWS instance that allows us to develop and accelerate applications using FPGAs in the cloud.
Tomi Engdahl says:
Pilvien sota (“The War of the Clouds”)
http://blog.tieturi.fi/2018/10/pilvien-sota.html?m=1
It is worth investing in Azure now. According to Forbes, this Microsoft offering is now one of the largest clouds in the world. Azure’s current users include Adobe, Heineken and HP, among others. According to Microsoft, 90% of Fortune 500 companies also use Azure.
If Azure doesn’t convince you, perhaps Amazon Web Services fits your needs better.
Tomi Engdahl says:
Paul Alcorn / Tom’s Hardware:
AWS announces new AMD EPYC-powered cloud instances, purportedly 10% less expensive than other instance types, available today — We’re here at AMD’s New Horizon Event to bring you up to the minute news on the company’s 7nm products. This is breaking news, so check back frequently or refresh the page for updates.
AMD Announces 64-Core 7nm Rome CPUs, 7nm MI60 GPUs, And Zen 4
https://www.tomshardware.com/news/amd-new-horizon-7nm-cpu,38029.html
AMD is expected to make major announcements about its new 7nm CPUs and GPUs. Intel continues to struggle with its 10nm manufacturing process, which is delayed until late 2019. If AMD can field 7nm processors early this year, it will mark the first time in the company’s history that it has had a process node leadership position over Intel. That should equate to faster, denser, and less power-hungry processors than Intel’s 14nm chips.
Dean Takahashi / VentureBeat:
AMD announces the Zen 2 architecture for the upcoming family of 7nm chips, which it says will start shipping in 2019
AMD reveals Zen 2 processor architecture in bid to stay ahead of Intel
https://venturebeat.com/2018/11/06/amd-reveals-zen-2-processor-architecture-in-bid-to-stay-ahead-of-intel/
AMD hopes the Zen 2 processors will keep it ahead of or at parity with Intel, the world’s biggest maker of PC processors. The earlier Zen designs enabled chips that could process 52 percent more instructions per clock cycle than the previous generation.
Zen has spawned AMD’s most competitive chips in a decade, including Ryzen for the desktop, Threadripper (with up to 32 cores) for gamers, Ryzen Mobile for laptops, and Epyc for servers. In the future, you can expect to see Zen 2 cores in future models of those families of chips.
AMD’s focus is on making central processing units (CPUs), graphics processing units (GPUs), and accelerated processing units (APUs) that put the two other units together on the same chip.
“Zen 2 is our next-generation system architecture,” Su said, noting that chips using it will be made with 7-nanometer manufacturing.
Su said the new chips will be targeted for the workloads of the future, including machine learning, big data analytics, cloud, and other tasks. AMD is going after the $29 billion total available market for data center chips by 2021.
Tomi Engdahl says:
Google launches Cloud Scheduler, a managed cron service
https://techcrunch.com/2018/11/06/google-launches-cloud-scheduler-a-managed-cron-service/?utm_source=tcfbpage&sr_share=facebook
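Cloud Scheduler jobs are defined with unix-cron expressions. A simplified Python matcher for a subset of that syntax (illustrative only; the managed service supports the full cron grammar):

```python
def field_matches(field: str, value: int) -> bool:
    """Match one unix-cron field ('*', '*/n', 'a,b', or a number) against
    a value. A simplified subset of cron syntax, for illustration."""
    if field == "*":
        return True
    if field.startswith("*/"):
        return value % int(field[2:]) == 0
    return value in {int(part) for part in field.split(",")}

def matches(schedule: str, minute: int, hour: int) -> bool:
    """Check the minute and hour fields of a 5-field cron schedule."""
    m, h, *_ = schedule.split()
    return field_matches(m, minute) and field_matches(h, hour)

# "*/15 9 * * *" fires at minutes 0, 15, 30, 45 during hour 9.
```

The point of a managed cron service is that this matching, plus retries and delivery to an HTTP endpoint or Pub/Sub topic, runs reliably without you keeping a VM alive just to host a crontab.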
Tomi Engdahl says:
https://www.salesforce.com/fi/blog/2018/pilvi-on-uusi-normaali.html
Tomi Engdahl says:
Bloomberg:
US GAO dismisses Oracle’s argument that the Pentagon’s $10B cloud contract violated federal standards and unfairly favored Amazon. The GAO decision deals a setback to Oracle’s federal contracts push and frees the Pentagon to pursue a single-source solution.
Oracle Loses Protest of Pentagon Cloud Bid Seen Favoring Amazon
https://www.bloomberg.com/news/articles/2018-11-14/oracle-loses-protest-of-pentagon-cloud-bid-seen-favoring-amazon
Oracle Corp. lost a challenge to the Pentagon’s $10 billion cloud contract, as the Government Accountability Office dismissed its argument that the winner-take-all contest violates federal procurement standards and unfairly favors Amazon.com Inc.
The GAO decision issued Wednesday deals a blow to Oracle’s push to expand its federal defense contracts, leaving the tech company with fewer options to improve its chances of winning the award. It also frees the Pentagon to pursue the single-source solution it has opted for all along.
“The Defense Department’s decision to pursue a single-award approach to obtain these cloud services is consistent with applicable statutes (and regulations) because the agency reasonably determined that a single-award approach is in the government’s best interests for various reasons, including national security concerns, as the statute allows,” Ralph White, the GAO’s managing associate general counsel for procurement law, said in a statement.
Tomi Engdahl says:
What Does Your Cloud Strategy Include, and Are You Transitioning Securely?
https://www.securityweek.com/what-does-your-cloud-strategy-include-and-are-you-transitioning-securely
Organizations Need the Right Technologies and Talent in Place to Ensure a Secure Transition to the Cloud
In my previous column, I wrote about the evolution in security from hardware and point products, to an approach that increasingly relies on security DevOps. However, there is another transition that is also well underway – the shift to the cloud. The RightScale 2018 State of the Cloud Report finds that 96 percent of respondents use cloud, with public cloud adoption increasing to 92 percent from 89 percent in 2017.
I bet if you asked each of the 997 survey respondents to describe their use of the cloud you’d get 997 different answers. That’s because the move to the cloud comes in many different forms, each with its own set of implications for security teams. Here are just a few:
SaaS offerings: Services like Office 365, Google, Box, Dropbox and Salesforce are some of the most common services organizations rely on that are accessed through the cloud.
Employee cloud usage: Employees are using cloud services without ever involving IT. In the case of Shadow IT, these may be legitimate tools to help them get their jobs done. Other times they are using services simply for entertainment.
SecOps in the cloud: According to Gartner, by 2019 more than 30% of the 100 largest vendors’ new software investments will have moved to cloud-only, and this includes investments in security technologies. If you are moving SecOps to the cloud, there are many ramifications. Can the service address your bandwidth and oversight requirements?
Corporate services in the cloud: Many organizations are taking advantage of the cloud to respond to business opportunities and challenges with agility – adding new services as needed and rapidly expanding capacity during periods of peak demand. If your organization is among this group, there are some important questions to ask: What infrastructure, apps, and data are moving to the public cloud and when? Will shifting to the cloud introduce gaps in our defenses and, if so, what security precautions can we take?
Below are a few recommendations:
• Consider a Cloud Access Security Broker (CASB), which simplifies access management at scale. When a user leaves the organization or changes roles, access can be updated automatically across all cloud services through a single, easy-to-read pane.
• With more employees connecting to cloud apps directly through the internet, a Secure Internet Gateway offers visibility into internet activity across all locations, devices, and users, and blocks threats before they ever reach your network or endpoints.
• Firewall cloud solutions can protect cloud workloads as they expand, contract or shift location.
• IT and security professionals with a deep understanding of cloud can be hard to find. Even with various certifications, there’s no substitute for specific knowledge of the actual service.
• Your team has tremendous technical and institutional knowledge that you don’t want to lose, but they may not have other skills needed to support the transition to the cloud, such as knowledge of JSON and Python. Offer training.
• While in-house staff comes up to speed, look for additional bench strength in the form of outsourced talent that can fill the skills gap and provide advisory and implementation services.
• Break the cycle of Shadow IT. As part of good security governance, architectural groups and committees should meet on a regular basis and include all key stakeholders from business, IT and security.
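The single-pane deprovisioning idea in the CASB recommendation above can be sketched generically. The connectors below are hypothetical stand-ins, not any particular CASB product’s API; the point is that one revocation call fans out to every connected cloud service and reports per-service results:

```python
# Illustrative sketch of CASB-style deprovisioning: one call revokes a
# departing user's access across all registered cloud services.

def revoke_everywhere(user: str, services: list) -> dict:
    """Revoke a user's access on every service; report per-service status."""
    results = {}
    for service in services:
        try:
            service.revoke(user)
            results[service.name] = "revoked"
        except Exception as exc:  # keep going; surface failures individually
            results[service.name] = f"failed: {exc}"
    return results

class SaaSConnector:
    """Hypothetical connector; a real CASB supplies one per service."""
    def __init__(self, name):
        self.name = name
        self.users = {"alice", "bob"}
    def revoke(self, user):
        self.users.discard(user)

report = revoke_everywhere(
    "alice", [SaaSConnector("office365"), SaaSConnector("salesforce")])
print(report)  # both services report "revoked"
```

A real broker would also log each action for audit, which is what makes the "single pane" valuable to security teams.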
As you shift to the cloud, remember that this is a journey and that no two journeys are alike.
Tomi Engdahl says:
AWS Adds New Feature for Preventing Data Leaks
https://www.securityweek.com/aws-adds-new-feature-preventing-data-leaks
Amazon announced this week that a new feature designed to prevent data leaks has been added to Amazon Web Services (AWS).
Improperly configured Simple Storage Service (S3) buckets can expose an organization’s sensitive files, as demonstrated by several incidents involving companies such as Viacom, Verizon, Accenture, Booz Allen Hamilton, and Dow Jones.
As a result of numerous incidents, AWS last year introduced a new feature that alerts users of publicly accessible buckets, but researchers have still found data leaks resulting from misconfigured buckets.
Amazon S3 Block Public Access aims to address this by providing settings for blocking existing public access and ensuring that public access is not granted to new items.
“If an AWS account is used to host a data lake or another business application, blocking public access will serve as an account-level guard against accidental public exposure. Our goal is to make clear that public access is to be used for web hosting!” said Jeff Barr, Chief Evangelist for AWS.
The new settings can be accessed from the S3 console, the command-line interface (CLI) or the S3 APIs, and they allow users to manage public ACLs and public bucket policies.
https://aws.amazon.com/blogs/aws/amazon-s3-block-public-access-another-layer-of-protection-for-your-accounts-and-buckets/
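As a sketch of the API route, the same four settings can be applied per bucket with boto3, the AWS SDK for Python. The bucket name is a placeholder, and the call itself needs valid AWS credentials, so it is wrapped in a function that takes the client as a parameter:

```python
# Sketch of enabling S3 Block Public Access with boto3. The four
# settings mirror the options described in the announcement.
PUBLIC_ACCESS_BLOCK = {
    "BlockPublicAcls": True,        # reject requests that set public ACLs
    "IgnorePublicAcls": True,       # ignore any existing public ACLs
    "BlockPublicPolicy": True,      # reject new public bucket policies
    "RestrictPublicBuckets": True,  # limit access under public policies
}

def block_public_access(s3_client, bucket: str) -> None:
    """Apply the bucket-level guard against accidental public exposure."""
    s3_client.put_public_access_block(
        Bucket=bucket,
        PublicAccessBlockConfiguration=PUBLIC_ACCESS_BLOCK,
    )

# Usage (requires credentials):
#   import boto3
#   block_public_access(boto3.client("s3"), "my-data-lake-bucket")
```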
Tomi Engdahl says:
Cloud stocks take a beating
https://techcrunch.com/2018/11/19/cloud-stocks-take-a-beating/?utm_source=tcfbpage&sr_share=facebook
It was a stormy Monday for cloud stocks today with the general trend pointing way down.
Wall Street was apparently disenchanted with just about every cloud company, large, small and in-between. Nobody, it seemed, was spared investors’ wrath:
Box was down 6.93 percent to $16.66
Workday was down 7.57 percent to $124.07
Twilio was down 13.76 percent to $76.90
Amazon (which of course includes AWS) was down 5.09 percent to $1,512.29
Tomi Engdahl says:
CNBC:
A look back at Diane Greene’s tenure as Google cloud boss: a struggle to catch up with AWS and Azure, and tension with Sundar Pichai on Github and Project Maven
Google’s cloud business under Greene was plagued by internal clashes, missed acquisitions, insiders say
https://www.cnbc.com/2018/11/21/google-cloud-plagued-by-internal-clashes-in-its-effort-to-catch-amazon.html
Google poured resources into its cloud unit during Diane Greene’s three-year run at the helm, but the company has still struggled against Amazon and Microsoft.
Greene clashed with Google CEO Sundar Pichai over a big cloud contract with the Department of Defense.
Thomas Kurian, a former Oracle executive, has been named as Greene’s successor.
Tomi Engdahl says:
Tom Krazit / GeekWire:
AWS introduces its own custom-designed Arm server processor, AWS Graviton Processor, claims 45% lower costs for some workloads — After years of waiting for someone to design an Arm server processor that could work at scale on the cloud, Amazon Web Services just went ahead and designed its own.
Amazon Web Services introduces its own custom-designed Arm server processor, promises 45 percent lower costs for some workloads
https://www.geekwire.com/2018/amazon-web-services-introduces-custom-designed-arm-server-processor-promises-45-percent-lower-costs-workloads/
GeekWire is reporting this week from Amazon’s signature cloud technology conference in Las Vegas, as the public cloud giant announces new products, partnerships and technology initiatives.
New – EC2 Instances (A1) Powered by Arm-Based AWS Graviton Processors
https://aws.amazon.com/blogs/aws/new-ec2-instances-a1-powered-by-arm-based-aws-graviton-processors/
Earlier this year I told you about the AWS Nitro System and promised you that it would allow us to “deliver new instance types more quickly than ever in the months to come.” Since I made that promise we have launched memory-intensive R5 and R5d instances, high frequency z1d instances, burstable T3 instances, high memory instances with up to 12 TiB of memory, and AMD-powered M5a and R5a instances. The purpose-built hardware and the lightweight hypervisor that comprise the AWS Nitro System allow us to innovate more quickly while devoting virtually all of the power of the host hardware to the instances.
We acquired Annapurna Labs in 2015 after working with them on the first version of the AWS Nitro System. Since then we’ve worked with them to build and release two generations of ASICs (chips, not shoes) that now offload all EC2 system functions to Nitro, allowing 100% of the hardware to be devoted to customer instances. A few years ago the team started to think about building an Amazon-built custom CPU designed for cost-sensitive scale-out workloads.
Tomi Engdahl says:
Asha McLean / ZDNet:
Amazon announces AWS Global Accelerator to boost performance across regions and AWS Transit Gateway, a tool to simplify network architecture and reduce overhead — The AWS Global Accelerator is expected to boost performance of global workloads and the AWS Transit Gateway is aimed at simplifying network architecture.
AWS Global Accelerator to boost performance across regions
https://www.zdnet.com/article/aws-global-accelerator-to-boost-performance-across-regions/
The AWS Global Accelerator is expected to boost performance of global workloads and the AWS Transit Gateway is aimed at simplifying network architecture.
AWS Global Accelerator has been touted by Peter DeSantis, VP of global infrastructure at AWS, as improving availability and performance for AWS customers’ end users.
Essentially, user traffic enters AWS Global Accelerator through the closest edge location. The accelerator then routes the traffic to the closest healthy application endpoint within the global AWS network. Finally, the application response returns over the AWS global network and reaches the user along the optimal path.
Users are directed to an AWS customer’s workload based on their geographic location, application health, and weights that the AWS customer can configure.
The new service also allocates static Anycast IP addresses.
In AWS Global Accelerator, customers are charged for each accelerator that is deployed and the amount of traffic in the dominant direction that flows through the accelerator. The company expects customers will typically set up one accelerator for each application, but more complex applications may require more than one.
Users will be charged a fixed hourly fee of $0.025 for every accelerator that is running, on top of the standard Data Transfer rates.
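For a rough sense of that fee structure, here is a small cost sketch built on the $0.025/hour figure quoted above. The per-GB data-transfer premium varies by region, so the value used in the example is a hypothetical placeholder, not an official rate:

```python
# Back-of-the-envelope monthly cost for AWS Global Accelerator:
# a fixed hourly fee per running accelerator, plus a premium on
# dominant-direction traffic.
FIXED_HOURLY_FEE = 0.025  # USD per accelerator per hour (quoted above)

def monthly_accelerator_cost(accelerators: int, gb_transferred: float,
                             premium_per_gb: float, hours: int = 730) -> float:
    fixed = accelerators * hours * FIXED_HOURLY_FEE
    transfer = gb_transferred * premium_per_gb
    return round(fixed + transfer, 2)

# One accelerator running all month, 1 TB of dominant-direction traffic
# at a hypothetical $0.015/GB premium:
print(monthly_accelerator_cost(1, 1000, 0.015))  # 18.25 fixed + 15.00 transfer
```

Since AWS expects one accelerator per application, the fixed portion scales with the number of applications fronted, not with traffic.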
Tomi Engdahl says:
Mary Jo Foley / ZDNet:
Microsoft details three independent root causes behind last week’s worldwide MFA outage that affected Azure, Office 365, Dynamics, and other Microsoft users — Microsoft has posted a root cause analysis of the multifactor authentication issue which hit a number of its customers worldwide last week.
Microsoft details the causes of its recent multi-factor authentication meltdown
https://www.zdnet.com/article/microsoft-details-the-causes-of-its-recent-multi-factor-authentication-meltdown/
Microsoft has posted a root cause analysis of the multifactor authentication issue which hit a number of its customers worldwide last week. Here’s what happened.
Microsoft’s Azure team has gone public with the root cause it discovered when investigating the November 19 worldwide multi-factor-authentication outage that plagued a number of its customers.
For 14 hours on November 19, Microsoft’s Azure Active Directory Multi-Factor Authentication (MFA) services were down for many. Because Office 365 and Dynamics users authenticate via this service, they also were affected.
The first root cause showed up as a latency issue in the MFA front-end’s communication to its cache services. The second was a race condition in processing responses from the MFA back-end server. These two causes were introduced in a code update rollout which began in some datacenters on Tuesday November 13 and completed in all datacenters by Friday November 16, Microsoft officials said.
A third identified root cause, which was triggered by the second, resulted in the MFA back-end being unable to process any further requests from the front-end, even though it seemed to be working fine based on Microsoft’s monitoring.
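Microsoft has not published the faulty code, but the failure class behind the second root cause is easy to illustrate. The following generic Python sketch (not Microsoft’s implementation) shows how an unguarded read-modify-write on shared state can lose updates when responses are processed concurrently, and how a lock serializes the critical section:

```python
# Generic illustration of a race condition in response processing.
import threading

class ResponseCounter:
    def __init__(self):
        self.processed = 0
        self._lock = threading.Lock()

    def handle_response_unsafe(self):
        # Read-modify-write with no lock: two threads can read the same
        # value and one increment is silently lost.
        current = self.processed
        self.processed = current + 1

    def handle_response_safe(self):
        # The lock makes the read-modify-write atomic.
        with self._lock:
            self.processed += 1

counter = ResponseCounter()
threads = [threading.Thread(
    target=lambda: [counter.handle_response_safe() for _ in range(10_000)])
    for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.processed)  # 40000 with the lock; often fewer without it
```

What made the Azure incident nasty is the third cause: the race left the back-end wedged while monitoring still reported it healthy, so the bug hid behind green dashboards.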
Tomi Engdahl says:
Brian Heater / TechCrunch:
Amazon debuts RoboMaker, a cloud-based service to let coders develop, test, and deploy their robotics applications using the open source Robot Operating System
Amazon launches a cloud-based robotics testing platform
https://techcrunch.com/2018/11/26/amazon-launches-a-cloud-based-robotics-testing-platform/
Amazon’s kicking off re:Invent week with the launch of AWS RoboMaker. The cloud-based service utilizes the widely deployed open-source software Robot Operating System (ROS) to offer developers a place to develop and test robotics applications.
RoboMaker essentially serves as a platform to help speed up the time-consuming robotics development process. Among the tools offered by the service are Amazon’s machine learning technologies and analytics that help create a simulation for real-world robotics development.
Amazon Web Services Announces AWS RoboMaker
https://www.businesswire.com/news/home/20181126005271/en/Amazon-Web-Services-Announces-AWS-RoboMaker
New service enables developers to quickly and easily build intelligent robotics applications
Tomi Engdahl says:
Why AWS re:Invent is arguably more important than Amazon’s Black Friday, Cyber Monday bonanza
https://www.zdnet.com/article/why-aws-reinvent-is-arguably-more-important-than-amazons-black-friday-cyber-monday-bonanza/
Yes, everyone shopped their wallets dry on Amazon during the big holiday sales push. But don’t forget that the AWS public cloud side of Amazon is likely to drive valuation and operating income going forward
Tomi Engdahl says:
A new way to think about security in AWS
https://blogs.cisco.com/security/a-new-way-to-think-about-security-in-aws
Amazon Web Services (AWS) provides numerous benefits to customers, allowing companies to be more responsive, available, and cost-efficient. It also provides a number of security capabilities, including strong identity and access management, granular activity logs, and strong policy enforcement.
However, that doesn’t mean you shouldn’t worry about security in your AWS environment. Simply put, AWS provides enough flexibility for you to shoot yourself in the foot if you aren’t careful. Gartner estimates that through 2022, at least 95 percent of cloud security failures will be the customer’s fault. Of course, AWS invented the now-famous shared responsibility model to educate customers on these risks and their role in protecting their workloads.
Tomi Engdahl says:
AWS Transit Gateway helps customers understand their entire network
https://techcrunch.com/2018/11/26/aws-transit-gateways-help-customers-understand-their-entire-network/?sr_share=facebook&utm_source=tcfbpage
Tonight at AWS re:Invent, the company announced a new tool called AWS Transit Gateway, designed to help you build a network topology inside AWS that lets you share resources across accounts and bring on-premises and cloud resources together in a single topology.
Amazon already has a popular product called Amazon Virtual Private Cloud (VPC), which helps customers build private instances of their applications. The Transit Gateway is designed to help build connections between VPCs, which, up until now, has been tricky to do.
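A minimal boto3 sketch of that idea, with placeholder VPC and subnet ids, might look like the following. The EC2 client is passed in as a parameter, so the API calls only fire against a real account with valid credentials:

```python
# Sketch of wiring multiple VPCs to one AWS Transit Gateway with boto3.

def attach_vpcs(ec2_client, vpc_subnets: dict) -> list:
    """Create one Transit Gateway and attach each VPC to it.

    vpc_subnets maps a VPC id to the subnet ids used for its attachment.
    """
    tgw = ec2_client.create_transit_gateway(
        Description="hub for cross-VPC and on-premises routing")
    tgw_id = tgw["TransitGateway"]["TransitGatewayId"]
    return [
        ec2_client.create_transit_gateway_vpc_attachment(
            TransitGatewayId=tgw_id, VpcId=vpc_id, SubnetIds=subnets)
        for vpc_id, subnets in vpc_subnets.items()
    ]

# Usage (requires credentials; ids are placeholders):
#   import boto3
#   attach_vpcs(boto3.client("ec2"),
#               {"vpc-11111111": ["subnet-aaaa"],
#                "vpc-22222222": ["subnet-bbbb"]})
```

The hub-and-spoke shape is the point: instead of a mesh of pairwise VPC peerings, every VPC attaches once to the gateway.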
Tomi Engdahl says:
AWS launches a base station for satellites as a service
https://techcrunch.com/2018/11/27/aws-launches-a-base-station-for-satellites-as-a-service/?utm_source=tcfbpage&sr_share=facebook
Tomi Engdahl says:
Talking to Satellites Just Got a Lot Easier
…because Amazon has just launched AWS Ground Station
https://blog.hackster.io/talking-to-satellites-just-got-a-lot-easier-6e86e78d73ea
During Andy Jassy’s keynote, Amazon announced a service called AWS Ground Station, which allows you to talk to satellites on a pay-as-you-go basis.
Created almost twenty years ago, the CubeSat standard has lowered the barrier to entry.
So far, more than 700 CubeSats have made their way to orbit.
All those satellites in orbit mean there needs to be a corresponding build-out of ground station capability, and yesterday’s announcement — made in partnership with Lockheed Martin — will be part of that company’s “Verge” ground station network.
Tomi Engdahl says:
Google Makes Secure LDAP Generally Available
https://www.securityweek.com/google-makes-secure-ldap-generally-available
Google this week announced the general availability of secure LDAP, after introducing the capability in October at Next ’18 London.
Allowing customers to manage access to traditional LDAP-based apps and IT infrastructure, it can be used with either G Suite or Cloud Identity, Google’s managed identity and access management (IAM) platform.
Secure LDAP, the Internet search giant explains, supports management of access to both software-as-a-service (SaaS) apps and traditional LDAP-based apps/infrastructure, regardless of whether on-premises or in the cloud, via a single IAM platform.
Secure LDAP enables authentication, authorization, and user/group lookups and, because the same user directory is used for both SaaS and LDAP apps, logging into services like G Suite and other SaaS apps is similar to that for traditional applications.
https://cloud.google.com/blog/products/identity-security/cloud-identity-now-provides-access-to-traditional-apps-with-secure-ldap
Tomi Engdahl says:
AWS Security Hub Aggregates Alerts From Third-Party Tools
https://www.securityweek.com/aws-security-hub-aggregates-alerts-third-party-tools
Amazon Web Services on Wednesday announced the launch of AWS Security Hub, a service designed to aggregate and prioritize alerts from AWS and third-party security tools.
Unveiled at the AWS re:Invent 2018 conference, AWS Security Hub provides organizations a comprehensive view of their security status by consuming, aggregating, organizing and prioritizing data from Amazon GuardDuty, Amazon Inspector, Amazon Macie, and tools from AWS partners.
A significant number of cybersecurity firms announced on Wednesday that their products can be integrated with the AWS Security Hub, including CrowdStrike, Twistlock, Tenable, Armor, McAfee, Splunk, Check Point, Palo Alto Networks, Alert Logic, Qualys, Sophos, Trend Micro, Sumo Logic and Fortinet. Each of these companies issued statements, press releases and blog posts regarding the partnership with AWS.
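As an illustration of what such a partner integration sends, here is a hedged sketch of a minimal finding in the AWS Security Finding Format (ASFF), the document shape Security Hub consumes via `batch_import_findings`. The ARNs, account id, and generator name below are placeholders:

```python
# Sketch of a third-party finding for AWS Security Hub in ASFF shape.
from datetime import datetime, timezone

def make_finding(account_id: str, product_arn: str, title: str) -> dict:
    """Build a minimal ASFF finding with placeholder identifiers."""
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return {
        "SchemaVersion": "2018-10-08",
        "Id": f"example-finding-{now}",
        "ProductArn": product_arn,
        "GeneratorId": "example-scanner",       # placeholder tool name
        "AwsAccountId": account_id,
        "Types": ["Software and Configuration Checks"],
        "CreatedAt": now,
        "UpdatedAt": now,
        "Severity": {"Normalized": 50},         # mid-range priority
        "Title": title,
        "Description": "Finding produced by a partner security tool.",
        "Resources": [{"Type": "Other", "Id": "example-resource"}],
    }

# Usage (requires credentials; the product ARN is a placeholder):
#   import boto3
#   boto3.client("securityhub").batch_import_findings(
#       Findings=[make_finding("123456789012",
#                              "arn:aws:securityhub:us-east-1:123456789012"
#                              ":product/123456789012/default",
#                              "Example aggregated alert")])
```

Because every partner normalizes into the same format, Security Hub can sort and prioritize alerts from a dozen vendors in one view.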
Tomi Engdahl says:
Amazon Targets Hybrid-Cloud Customers With On-Site Servers
https://www.bloomberg.com/news/articles/2018-11-28/amazon-announces-cheap-data-storage-windows-compatible-services
Amazon.com Inc. will let customers put servers used in the company’s cloud-computing data centers into their own facilities, an effort to reach businesses that want to store some of their technology functions in the cloud while keeping tighter control of others.
The move announced Wednesday by Amazon Web Services Chief Executive Officer Andy Jassy helps provide hybrid-cloud strategies desired by many larger enterprise customers in an area where cloud-competitor Microsoft Corp. is making headway.
Tomi Engdahl says:
AWS announces new Inferentia machine learning chip
https://techcrunch.com/2018/11/28/aws-announces-new-inferentia-machine-learning-chip/
Tomi Engdahl says:
Amazon gets into the blockchain with Quantum Ledger Database & Managed Blockchain
https://techcrunch.com/2018/11/28/amazon-gets-into-the-blockchain-with-quantum-ledger-database-managed-blockchain/