The smallest, and probably the fastest, solution to the problem stated above.
Well, as far as the flow goes, CerberusValidator works with schemas in a Mapping structure: essentially a dict whose values are themselves dicts describing the rules (such as types) for each field. If you don’t get it, then check this out: https://docs.python-cerberus.org/en/stable/
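To make that Mapping idea concrete, here is a toy sketch (not Cerberus’s or Spidermon’s actual code; the field names and the `toy_validate` helper are made up for illustration) of what such a schema looks like and how a type rule could be checked:

```python
# A Cerberus-style schema: field names map to dicts of rules.
schema = {
    "name": {"type": "string", "required": True},
    "price": {"type": "float"},
}

# Toy type lookup; the real Cerberus Validator supports many more rules.
TYPES = {"string": str, "float": float, "integer": int}

def toy_validate(document, schema):
    """Check only 'type' and 'required' rules, to show the shape of the data."""
    errors = {}
    for field, rules in schema.items():
        if field not in document:
            if rules.get("required"):
                errors[field] = "required field"
            continue
        if not isinstance(document[field], TYPES[rules["type"]]):
            errors[field] = f"must be of {rules['type']} type"
    return errors

print(toy_validate({"name": "book", "price": 9.99}, schema))  # {}
print(toy_validate({"price": "free"}, schema))
```

The real library does the same job via `cerberus.Validator(schema).validate(document)`, which is why all it needs from the user is the schema mapping and the data.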
But Cerberus only cares about the schema and the data it gets from the user, not where they come from. Most of our users will be giving the schema as either a URL or a path to a file, which was fine by us until somewhere around week 12, when I realized I had forgotten to handle that properly in the code. Nothing to be afraid of; I had to redo some old functions, and actually improved a lot of old code in the process. How time flies by. Damn.
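A minimal sketch of what accepting both sources could look like, assuming JSON schemas and a hypothetical `load_schema` helper (this is not the project’s actual code):

```python
import json
from urllib.parse import urlparse
from urllib.request import urlopen

def load_schema(source):
    """Accept either an http(s) URL or a local file path and
    return the parsed JSON schema as a dict, which is all
    Cerberus needs to start validating."""
    if urlparse(source).scheme in ("http", "https"):
        # Remote schema: fetch and parse it from the URL.
        with urlopen(source) as resp:
            return json.load(resp)
    # Otherwise treat the source as a path on disk.
    with open(source) as fh:
        return json.load(fh)
```

Funnelling both cases into one helper keeps the validator itself oblivious to where the schema came from, which matches how Cerberus is designed.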
Not much is left to be done, except writing a few more tests, a lot of testing, and merging it to master. I am confident we can make it before August 19. Let’s see. Fingers crossed. This is vipulgupta2048 signing off for the second-to-last time here. I won’t be going anywhere, if that’s what you are thinking.
There is a lot of work to be done at ScrapingHub x The Scrapy Project.
Looking forward to new challenges.
Next up, the final work report!
Week #10 24/07 to 30/07
The integration finally worked, and a completely unrelated banner!!
Week #9 17/07 to 23/07
Well, integration isn’t working, and I am not giving up.
Week #8 10/07 to 16/07
I just realized that there aren’t many weeks left. Good times like these should never end.
Week #7 03/07 to 09/07
Week #6 26/06 to 02/07
Well, I survived the first evaluation, as you can all see. I made some mistakes along the way, recovered with advice from my mentors, and am hopefully going strong into work period 2. Let’s talk shop, yes.
Week #5 19/06 to 25/06
The first evaluation is here; I got done with a milestone and took a small break for a personal event.
Week #4 12/06 to 18/06
Well, this has been another rather testing week.
Week #3 5/06 to 11/06
If distance is the path traveled between 2 points, and displacement is the shortest path you can take from the initial point to your destination,
then I’d say that after making full circles, my overall displacement is 0. But I sure as hell have come a long way in learning more about Python as a programming language, just by reading, understanding, and implementing new code concepts more than ever before.
Week #2 – 28/05 to 04/06
Well, this has been a good week of learning new things, revising old concepts, and reading the implementation of one of the oldest modules in Python to understand the idea behind Python packaging. I feel bad about not being able to write a lot of code, but I think that without understanding the existing code base, the way forward would have been fruitless, and more disappointing.
So, let’s start by answering our 3 infamous questions, and later I’ll give you a broad picture of Python packaging, which I will try to explain to you like a 5-year-old.
Week #1 – 21/05 to 27/05
In the last meeting, my mentors and I decided upon the mini-project that I suggested. Here’s a brief overview of what I decided to work on over the course of the last week.
I thought remote work would be easy. I thought it would be comfortable, creative, and carefree. I thought I would be so efficient that I’d be in the zone all the time. If only I knew how wrong I was. If only I had looked beyond the perks of working remotely to see the dedication and discipline needed. And the best part is that I love the concept even more.
Someone truly great once said, “If you can’t explain it simply, you don’t understand it well enough.” That’s the way of life, isn’t it? Hence, I would like to take a shot at explaining how Scrapy‘s bigger brother, Spidermon, validates the data scraped by spiders and how the gears turn; in the process, I would learn as well. Let’s DIG IN!
This will be quick: this week I learned about PyGitHub, which implements and interacts with the GitHub API v3 and GitHub Enterprise API v3 in Python, and could possibly help you with any crazy ideas you might have for repositories on GitHub. Want to check for files in one or 1000 repositories of an organization or individual? Done. Want to make pull requests, file issues, and handle other administrative tasks through scripts, for users with access? Done. Want to check repositories for certain parameters, such as the last release or the members, both public and private? Done.
Does it ever seem to you that when you have a lot of free time, even with your hands dipped, dirty, and stuck in so many other pies (projects) that would naturally take you a long time to eat and finish (complete), you still want to reach out for the next big pie (new project)? That’s me in summers, and oh man, the pies are piling up. FAST and FURIOUS.
In the bright sunny days of February, I walked across the Indian Institute of Technology, Delhi (IIT-D), towards Investopad, Hauz Khas, to attend a meetup that I had been looking forward to for a very long time, ever since I saw it on the meetup page. I was both nervous and full of excitement at the prospect of attending my first Django party, organized by PyDelhi and PyLadies Delhi together, with the aim of grasping what Django is and how to truly harness its power in the field of web development.
2017 had been a busy year for me, and the months of October and November were no exception. Here’s my experience of this year’s PyCon India and all the fun that I had volunteering, learning, and attending this awesome event.