Learning Cloud
By Learning Cloud
About this podcast
Learn about AWS cloud computing and listen to interviews with experts in the field. Our passion is to help you stay motivated to learn and master these technologies.
Podcasts like "Learning Cloud"
AWS re:Invent 2015
By Amazon Web Services
AWS Podcast
By Amazon Web Services
AWS TechChat
By AWS TechChat
DevOps on AWS Radio
By Stelligent Systems LLC
AWS Faculty Mentoring Faculty
By Andy Jones
Episodes (Total: 10)
Oct. 25, 2017 · 00:08:07
Are you a slacker? Have you been slacking recently? If not, you should be slacking more often. That's just a little play on words, because today we're talking about Slack. Of course I don't mean being lazy or putting things off; I mean the collaboration tool Slack.

What is Slack? Slack is a cloud-based team collaboration tool. In simpler terms, it is a chat program; think of Skype or Twitter. Slack was created by Daniel Stewart Butterfield, a Canadian entrepreneur best known as a co-founder of the photo-sharing website Flickr. By the way, he's worth $1.69 billion.

How does Slack work? Basically, you can create a private group so that you can chat with only the members of that group, and there's no charge to use Slack. Slack differs from Twitter in that your tweets are generally visible to everyone unless they're personal tweets. Slack focuses on private interaction and lets you embed pictures and video in your messages. Slack organizes groups with the following concepts:
- Teams and workspaces
- Channels
- Messages

In this episode I want to introduce you to a Qwiklab where you will build a chat bot for Slack. Chat bots can interact with teams and users, respond to commands, and post notifications, giving all conversation participants visibility into team activities. You will build a bot that posts CloudWatch alarms to your Slack channel. The high-level steps are:
1. Create a Slack account.
2. Create a webhook in Slack.
3. Create a Lambda function: select the Slack blueprint, enter the webhook, create the Python code, and test it.
4. Create a CloudWatch alarm.

You can find the detailed steps at this link (you'll have to sign up to see the instructions): https://qwiklabs.com/focuses/3444

Remember: "If you're not willing to learn no one can help you but if you're determined to learn no one can stop you." Bye bye for now, until next time. -Raf
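The Qwiklab supplies its own blueprint code, but as a rough sketch of the idea (the webhook URL is a placeholder, and the handler assumes the standard shape of a CloudWatch alarm notification delivered through SNS), the Lambda function might look like this:

```python
import json
import urllib.request

# Placeholder: replace with the incoming-webhook URL you created in Slack.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def build_slack_payload(sns_message):
    """Turn a CloudWatch alarm notification (the SNS message body,
    a JSON string) into a Slack incoming-webhook payload."""
    alarm = json.loads(sns_message)
    text = "Alarm {0} is {1}: {2}".format(
        alarm["AlarmName"], alarm["NewStateValue"], alarm["NewStateReason"])
    return {"text": text}

def lambda_handler(event, context):
    # SNS delivers the alarm as a JSON string inside the first record.
    payload = build_slack_payload(event["Records"][0]["Sns"]["Message"])
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)
```

The CloudWatch alarm you create in the last step publishes to an SNS topic, and that topic triggers this function.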
Oct. 11, 2017 · 00:10:29
"AWS Lambda is a compute service that runs your code in response to events and automatically manages the compute resources for you, making it easy to build applications that respond quickly to new information."

Gartner presented at my workplace, and one of the things they said is that serverless will be the future of the cloud.

The potential of Lambda: it can handle just about any task related to your infrastructure or app configuration, such as:
- Launching instances
- Loading data into a database
- Reading and processing messages from a queuing system

The beauty is that you only have to write the code. Netflix, for example, explains how it uses event-based triggers to automate the encoding process for media files, validate backup completions, and deploy instances at scale.

Main purpose: the use case I'll be showing you today is using Lambda to manipulate images, in particular how to create thumbnails. Uploading images to S3 is convenient; downloading them and using them on a website isn't as simple. We need thumbnails (multiple sizes, mind you!) and probably a standard size in which we would like to showcase the uploaded images. I'll be referencing a Qwiklab.

Steps:
1. Log in to your account and select the Lambda service.
2. Create a Lambda function, either from a blueprint or with "Author from scratch".
3. Select a language. Lambda supports Python, Node.js, C#, and Java. Everything I've read suggests that Python provides the fastest performance, running your code within milliseconds of an event.
4. Paste the code. The makeup of a Lambda function: library imports (boto3 is the AWS SDK for Python), functions, and a handler.
5. Connect to S3 and create a bucket.
6. Upload an image to the bucket; this will trigger the Lambda function.

https://qwiklabs.com/focuses/2966
http://karlcode.owtelse.com/blog/2017/03/15/python-is-faster-than-node-dot-js/
https://aws.amazon.com/solutions/case-studies/netflix-and-aws-lambda/
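The lab provides its own function, but as a rough sketch of the core idea (the destination bucket name and thumbnail size are made up, the handler assumes the standard S3 event shape, and Pillow would have to be bundled with the deployment package), a thumbnailing Lambda might look like:

```python
import os

def thumbnail_size(width, height, max_side=128):
    """Scale (width, height) down so the longer side equals max_side,
    preserving aspect ratio. Kept pure so it's easy to test."""
    scale = max_side / max(width, height)
    if scale >= 1:
        return width, height  # already small enough
    return max(1, round(width * scale)), max(1, round(height * scale))

def lambda_handler(event, context):
    # boto3 ships with the Lambda Python runtime; Pillow does not,
    # so it would need to be packaged with the function.
    import boto3
    from PIL import Image

    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    s3 = boto3.client("s3")
    local_path = os.path.join("/tmp", os.path.basename(key))
    s3.download_file(bucket, key, local_path)

    with Image.open(local_path) as img:
        img.thumbnail(thumbnail_size(*img.size))
        thumb_path = local_path + ".thumb.jpg"
        img.save(thumb_path, "JPEG")

    # Hypothetical destination bucket; the lab uses its own naming scheme.
    s3.upload_file(thumb_path, bucket + "-thumbnails", "thumb-" + key)
```

Configuring the S3 bucket's event notification to invoke this function is what makes step 6 "trigger" it.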
Sept. 5, 2017 · 00:07:35
Hi there, and welcome to the 14th episode of the Learning Cloud podcast, the place for all things Cloud and DevOps. My name is Raf Calderon. I'm excited to be back podcasting. It's been a while since I last recorded; I needed time to clear my mind and organize my thoughts. But hey, I'm back, and it feels good. It's like riding a bike: you don't forget it.

Today I want to talk about logs, in particular how to manage logs in the cloud.

Intro: working with logs is a challenging task. Computing environments generate massive amounts of log files filled with all kinds of information: system logs, application logs, security logs. These files are generated every day, all year round, resulting in multiple gigabytes of unstructured plain-text data. In most cases, manually going through plain log files and grepping for particular strings is daunting, so we turn to professional applications to help us with this task. The leaders in this market are Splunk and ELK.

Splunk
- An enterprise product sold and supported by Splunk Inc., targeting the large-enterprise market.
- Used a lot by security groups for long-term archival and forensics.
- It's costly.

ELK
- An open-source product. ELK is an acronym for Elasticsearch, Logstash, and Kibana.
- You can buy support from Elastic.
- Since ELK is open source, it's used by all kinds of companies, mostly small but also large.
- I've seen it used in production to track app metrics in a more dynamic way.

Transition: now I want to switch my focus to the cloud. The logging solution in AWS is called CloudWatch. Amazon CloudWatch Logs is a managed service that enables you to store your EC2 logs in the cloud.

How does it work? There is an agent on the instance, and you manage everything from the AWS Console. Some features:
- Archiving/retention
- Metrics: for example, counting failed login attempts in the system log
- Alarms

Demo you should try: if you're interested in getting some hands-on practice with CloudWatch Logs, this is easy. This is why I love working with AWS. Launch an EC2 instance and send its logs to CloudWatch:
1. Launch an instance with an appropriate role.
2. Install the agent.
3. Configure the agent: region, log file, log group name, stream name.
4. Watch the logs.

I'll put the link to the CloudWatch Logs user guide, where you can get more information, below.

Closing: that's it for today. Please leave a comment on the post if you enjoyed it; I'd love to hear your thoughts. And remember: "If you're not willing to learn no one can help you; if you're determined to learn no one can stop you."

Bye bye for now, and till next time.

Links
http://blog.takipi.com/splunk-vs-elk-the-log-management-tools-decision-making-guide/
http://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/cwl-ug.pdf
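As a sketch of the agent-configuration step, the classic awslogs agent reads an INI file mapping log files to log groups and streams (the paths, group name, and timestamp format below are examples; on Amazon Linux the region is set separately in the agent's awscli.conf):

```ini
; /etc/awslogs/awslogs.conf (example values)
[general]
state_file = /var/lib/awslogs/agent-state

[/var/log/messages]
file = /var/log/messages
log_group_name = demo-syslog
log_stream_name = {instance_id}
datetime_format = %b %d %H:%M:%S

; region goes in /etc/awslogs/awscli.conf:
; [default]
; region = us-east-1
```

Once the agent is running, each configured file shows up as a stream under its log group in the CloudWatch console.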
Dec. 5, 2016 · 00:06:03
Welcome to the 13th episode of the Learning Cloud podcast, the place for all things Cloud and DevOps. My name is Raf Calderon and I'll be your host. In today's show I want to talk about how to analyze your AWS account balance.

AWS is an amazing service, with all the things you can do: start a compute instance, deploy resources on different continents, deploy your own WordPress website. But it's not free. Your AWS account is linked to your credit card, so you need to keep watch of what resources you consume, especially if your account is a personal account for learning purposes. In AWS you're paying for infrastructure resources as a utility, so you have to watch how much you're consuming.

Console -> Account Name -> My Billing Dashboard
- Here you can see your monthly balance.
- It displays a breakdown per resource.
- It doesn't show you the breakdown per day.

My Billing Dashboard -> Cost Explorer
- Select "Monthly costs by service".
- Select "Day" to view the daily balance; this will show you each service.
- Specify the time range: last 7 days. This gives you a very detailed view. Even if you're only paying 25 cents per day, it adds up.

My EC2 charge was high, but my instance was stopped. Why, if my EC2 instance was stopped? Ahh, I found a running instance in another AZ. I also found an RDS snapshot.
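The walkthrough above uses the console; AWS later added a Cost Explorer API, so the same daily-costs-by-service view can be pulled programmatically. A sketch (credentials and real billing data are assumed, so the API call is wrapped in a function rather than run here):

```python
from datetime import date, timedelta

def daily_cost_query(end=None, days=7):
    """Build the request for Cost Explorer's GetCostAndUsage:
    daily unblended cost for the last `days` days, grouped by service."""
    end = end or date.today()
    start = end - timedelta(days=days)
    return {
        "TimePeriod": {"Start": start.isoformat(), "End": end.isoformat()},
        "Granularity": "DAILY",
        "Metrics": ["UnblendedCost"],
        "GroupBy": [{"Type": "DIMENSION", "Key": "SERVICE"}],
    }

def print_daily_costs():
    # Requires AWS credentials, so this is not invoked here.
    import boto3
    ce = boto3.client("ce", region_name="us-east-1")  # Cost Explorer endpoint
    resp = ce.get_cost_and_usage(**daily_cost_query())
    for day in resp["ResultsByTime"]:
        for group in day["Groups"]:
            amount = group["Metrics"]["UnblendedCost"]["Amount"]
            print(day["TimePeriod"]["Start"], group["Keys"][0], amount)
```

Printing per-day, per-service amounts like this is exactly how a stray 25-cents-a-day charge becomes visible.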
Nov. 29, 2016 · 00:06:22
Welcome to the 12th episode of the Learning Cloud podcast, the place for all things Cloud and DevOps. My name is Raf Calderon and I'll be your host. In today's show I want to talk about how to identify all the resources you have configured in your AWS account.

Why is this important? It helps in tracking your costs. AWS offers you PaaS: you consume resources as a utility. Instead of paying for one big annual hosting contract, you pay piecemeal for individual resources. You pay for the size of each resource and the number of resources.

Why is it challenging to do? Because you only see the resources within a region. We'll work around this using the Tag Editor function.

What are tags? Tags are metadata on a resource. Tags enable you to categorize your AWS resources in different ways, for example by purpose, owner, or environment. This is useful when you have many resources of the same type: you can quickly identify a specific resource based on the tags you've assigned to it. Each tag consists of a key and an optional value, both of which you define.

Process:
1. Open the Tag Editor: Console -> Resource Groups -> Tag Editor.
2. Filter regions: select all.
3. Filter resource types: select all.
4. Find resources.
5. Sort by resource type.

Remember to sign up for the live stream of AWS re:Invent. The keynote event is on Tuesday, Nov 29.

That's it for today. As always, remember that "If you're not willing to learn no one can help you, but if you're determined to learn no one can stop you".

-Raf
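The Tag Editor in the console is the episode's approach; AWS later exposed the same sweep through the Resource Groups Tagging API, so the cross-region inventory can also be scripted. A sketch (the region list is illustrative, and the API call needs credentials, so it is wrapped in a function rather than run here):

```python
from collections import Counter

def service_of(arn):
    """Extract the service from an ARN, e.g.
    'arn:aws:ec2:us-east-1:123456789012:instance/i-abc' -> 'ec2'."""
    return arn.split(":")[2]

def count_resources_by_service(regions=("us-east-1", "us-west-2")):
    # Requires AWS credentials, so this is not invoked here.
    import boto3
    counts = Counter()
    for region in regions:
        client = boto3.client("resourcegroupstaggingapi", region_name=region)
        for page in client.get_paginator("get_resources").paginate():
            for res in page["ResourceTagMappingList"]:
                counts[service_of(res["ResourceARN"])] += 1
    return counts
```

Because the API is per-region like the console, the loop over regions is what turns "you only see the resources within a region" into a full-account inventory.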
Nov. 15, 2016 · 00:12:20
Welcome to the 11th episode of the Learning Cloud podcast, the place for all things Cloud and DevOps. My name is Raf Calderon and I'll be your host.

In today's show I want to talk about installing the AWS CLI on macOS. This is a simple task, but it touches on a number of configurations that give you a good starting point as you begin using AWS.

First, I'd like to remind you that AWS re:Invent 2016 runs from Sunday, Nov 27 through Friday, Dec 2. The event is sold out, but sign up for the live stream and you'll be able to watch the keynotes with Andy Jassy and Werner Vogels.

The prerequisites for the task we're discussing are:
- An AWS account
- A Mac system

There are three ways you can access AWS:
1. The Console
2. Via SSH, using key pairs
3. Programmatically, using access keys

Reason for doing this: you can do a lot in the GUI, but to start using AWS professionally you'll want various tools to automate the tasks you perform via the Console. Some of these tools are CloudFormation, OpsWorks, Terraform, and Packer.

In my case I was working with Packer to create a customized AMI. The idea is that I would set up Packer on my system at home and define and run a configuration. Packer would then log in to AWS and run all the commands to start a new instance, do some work on it, and create a new AMI based on that image. Today we're just focusing on setting up CLI access to AWS, which Packer will use later.

I have written a detailed article showing how to do this; you can find the link in the show notes.

Remember to sign up for the re:Invent live stream.

Thanks for listening, and remember "If you're not willing to learn no one can help you but if you're determined to learn no one can stop you".

Till next time.

-Raf
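For the programmatic-access path, running `aws configure` after installing the CLI ends up writing two small INI files under ~/.aws; they look like this (the key values below are AWS's documentation placeholders, not real credentials):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

# ~/.aws/config
[default]
region = us-east-1
output = json
```

Tools like Packer and Terraform pick up the same `[default]` profile, which is why setting this up once pays off across the whole toolchain.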
Oct. 30, 2016 · 00:30:01
Welcome to the 10th episode of the Learning Cloud podcast, the place for all things Cloud and DevOps. My name is Raf Calderon and I'll be your host. Today we're back with an exciting interview. Our guest is a Sr. DevOps engineer.

Some of the things we discussed are:
- The difference between a DevOps engineer and a developer
- His journey from junior developer to Sr. DevOps engineer
- His side projects and what technologies he is playing with

I'm sure you'll enjoy this conversation, so without any further ado, here's the interview with Richard Whitney.

I hope you have enjoyed this interview with Richard Whitney. I know I learned a lot, and I look forward to having him back on the show in the future.

That's it for today, and remember "if you're not willing to learn no one can help you but if you're determined to learn no one can stop you".

Links:
https://github.com/rb1whitney/rbwexample/blob/master/Vagrantfile
http://docs.the-startup-stack.com/databases/introduction/
Oct. 17, 2016 · 00:11:59
Welcome to the 9th episode of the Learning Cloud podcast, the place for all things Cloud and DevOps. My name is Raf Calderon and I'll be your host. Today I want to spend some time talking about installing Chef in the cloud. This text is only the outline; you need to listen to the podcast for the full content.

I built a Chef server on AWS last night.

What is Chef? It's an automation tool, an orchestrator. It lets you define a configuration for a server, and then it will keep it there. We use Chef at work to automate our cloud builds. Installing my own Chef server is something I should've done a while back. With any server software solution, learning starts with installing it yourself; watching a video doesn't cut it. I'm a visual learner, so I had a Pluralsight video as reinforcement. I had the option to spin up an instance fully configured with Chef pre-installed, but there's not as much learning in that.

Building the environment. The requirement: a Chef server, a Chef workstation, and a Chef client. This is where AWS really shined. In the past I wouldn't have gotten past this stage because of the prep time required. I had the option to select different OSes, and it was so easy to do in AWS. I felt empowered by my knowledge of AWS: I had the account, I knew how to work the environment, I had my own VPC, my instances would get public IPs, I knew how to make sure the port access was there, and I knew how to connect to my instances with SSH keys. I knew this was a test environment that I could destroy at any second. It's all external to my system, not VirtualBox. Better yet, this would all be free.

An error regarding memory put elasticity into practice. I kicked off the installer and saw an error, which put me in troubleshooting mode. I looked at the requirements and they wanted 8 GB, so I needed to add more memory to my instance. I stopped the instance and resized it from t2.micro to t2.medium. The instance was running overnight and I got no charges, even though the free tier is t2.micro and I used a t2.medium. It's amazing that I was able to do all that for free.

You can see the step-by-step instructions here. I hope you have enjoyed today's episode. Please give us a rating in iTunes and leave a comment on our website, learningcloud.tech. Remember "If you're not willing to learn no one can help you, but if you're determined to learn no one can stop you". Bye bye for now and until next time. -Raf
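The resize described above was done through the console; the same stop/modify/start dance can be scripted with boto3 (the memory table is a small illustrative subset of the t2 family, and the EC2 calls need credentials, so they are wrapped in a function rather than run here):

```python
# Memory per type for a few t2 sizes, in GiB (per AWS's published specs).
T2_MEMORY_GIB = {"t2.micro": 1, "t2.small": 2, "t2.medium": 4, "t2.large": 8}

def smallest_t2_for(required_gib):
    """Pick the smallest t2 type with at least `required_gib` of memory."""
    for name, mem in sorted(T2_MEMORY_GIB.items(), key=lambda kv: kv[1]):
        if mem >= required_gib:
            return name
    raise ValueError("no t2 type is large enough")

def resize_instance(instance_id, new_type, region="us-east-1"):
    # Requires AWS credentials, so this is not invoked here.
    # An instance must be stopped before its type can be changed.
    import boto3
    ec2 = boto3.client("ec2", region_name=region)
    ec2.stop_instances(InstanceIds=[instance_id])
    ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])
    ec2.modify_instance_attribute(
        InstanceId=instance_id, InstanceType={"Value": new_type})
    ec2.start_instances(InstanceIds=[instance_id])
```

The waiter is the scripted equivalent of watching the console until the instance state reads "stopped" before changing the type.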
Oct. 2, 2016 · 00:42:51
Welcome to the 8th episode of the Learning Cloud podcast, the place for all things Cloud and DevOps. My name is Raf Calderon and I'll be your host.

We have a great show for you today: our second guest joining us. Funny that I number our guests; that's because the show is young, and I'm sure that eventually I'll stop counting. As a brief introduction, let me just say that our guest is a personal friend of mine, and you'll hear a bit about how we know each other; I'll leave that as a surprise for now. He is a seasoned IT veteran, an expert in DevOps and cloud computing, and an all-around nice guy. So I'm pleased to introduce you to Eric VanderPoel from BarclaycardUS.

I hope you have enjoyed this interview with Eric VanderPoel. As a disclaimer, let me add that while Barclaycard is a great company to do business with and to work for, the views expressed in this show are my own and my guest's. This show doesn't have any legal association with Barclaycard.

Lastly, if you have enjoyed the show, please give us a rating in iTunes and leave a comment on our website, learningcloud.tech. Remember "If you're not willing to learn no one can help you, but if you're determined to learn no one can stop you".

Bye bye for now and until next time.

-Raf
Sept. 25, 2016 · 00:09:25
Welcome to the seventh episode of the Learning Cloud podcast, the place for all things Cloud and DevOps. My name is Raf Calderon and I'll be your host. I'm really pleased at how the podcast is unfolding, how it helps keep me focused on my pursuit of learning cloud computing, and how it encourages the community to pursue continuing education.

Review of last week's guest from Spreaker:
- They lock in spot instances by putting out a high bid price.
- They run their own databases on EC2 instances.
- They're adopting containers and Kubernetes.

Update: I passed the test on my first try.
- So it's one down and four to go.
- I would've liked to score higher, but I was really interested in getting this first test behind me.
- So though not high, I did pass, and it's not like I'm done studying. I'll continue to fill knowledge gaps as I prepare for the next tests.
- By the way, next up is the Certified Developer Associate.

AWS Blog: Enterprise Support update
- Training credits: 500 Qwiklab credits, plus 30% off additional credits.
- Cloud Operations Reviews: a review of your cloud operations and the associated management practices.
- Well-Architected Review: this review allows customers to measure their architecture against AWS best practices.