It’s 4 o’clock on a rainy Friday morning, and one more Google Summer of Code has ended. My second, to be exact. Time for me to hang up my boots and start writing another report, probably my last on this subject. There’s a lot to write.
Well, as far as the flow goes, CerberusValidator works with schemas in a Mapping structure: essentially a dict whose values are dicts describing the rules and types for each field. If that doesn’t make sense, check this out: https://docs.python-cerberus.org/en/stable/
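To make the Mapping idea concrete, here is a toy sketch of what such a schema looks like, with a tiny stdlib-only checker standing in for what `cerberus.Validator` does far more thoroughly in practice. The field names and the `validate` helper below are my own illustration, not Cerberus or Spidermon code:

```python
# A Mapping-style schema: a dict whose values are dicts of rules.
# In real code you would hand a dict like this to cerberus.Validator.
schema = {
    "name": {"type": str, "required": True},
    "price": {"type": float},
}

def validate(document, schema):
    """Toy stand-in for cerberus.Validator: returns a dict of errors."""
    errors = {}
    for field, rules in schema.items():
        if field not in document:
            if rules.get("required"):
                errors[field] = "required field"
            continue
        if not isinstance(document[field], rules["type"]):
            errors[field] = f"must be of type {rules['type'].__name__}"
    return errors

print(validate({"name": "book", "price": 9.99}, schema))  # {} -> valid
print(validate({"price": "free"}, schema))  # errors for both name and price
```

The key point is simply that the schema is plain data, a dict of dicts, which is why it can come from anywhere.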
But Cerberus only cares about the schema and the data it gets from the user, not where either comes from. Most of our users will be supplying the schema as either a URL or a path to a file, which was fine by us until somewhere around week 12, when I realized I had forgotten to handle that properly in the code. Nothing to be afraid of; I had to redo some old functions, and actually improved a lot of old code in the process. How time flies. Damn.
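The URL-or-path handling can be sketched in a few lines. This `load_schema` helper is my own stdlib-only illustration of the idea, not the actual Spidermon code; either way, Cerberus itself only ever sees the resulting dict:

```python
import json
from urllib.parse import urlparse
from urllib.request import urlopen

def load_schema(source):
    """Load a JSON schema dict from either a URL or a local file path.

    Illustrative helper only; the real project code differs.
    """
    if urlparse(source).scheme in ("http", "https"):
        with urlopen(source) as response:  # fetch the schema over HTTP(S)
            return json.load(response)
    with open(source) as f:  # otherwise treat the source as a file path
        return json.load(f)
```

So `load_schema("schemas/item.json")` and `load_schema("https://example.com/item.json")` both hand back the same kind of plain dict.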
Not much is left to be done, except writing a few more tests, a lot of testing, and merging it into master. I am confident we can make it before August 19. Let’s see; fingers crossed. This is vipulgupta2048, signing off for the second-to-last time here. I won’t be going anywhere, if that’s what you’re thinking.
There is a lot of work to be done at ScrapingHub x The Scrapy Project.
Looking forward to new challenges.
Next, final work report!
Week #10 – 24/07 to 30/07
The integration finally worked, and a completely unrelated banner!
Week #9 – 17/07 to 23/07
Well, the integration isn’t working, and I am not giving up.
Week #8 – 10/07 to 16/07
I just realized that there aren’t many weeks left. Good times like these should never end.
Week #7 – 03/07 to 09/07
Week #6 – 26/06 to 02/07
Well, I survived the first evaluation, as you can all see. I made some mistakes along the way, recovered with advice from my mentors, and am hopefully going strong into work period 2. Let’s talk shop, yes.
Week #5 – 19/06 to 25/06
The first evaluation is here; I got done with a milestone and took a small break for a personal event.
Week #4 – 12/06 to 18/06
Well, this has been another rather testing week.
Week #3 – 05/06 to 11/06
If distance is the path traveled between two points, and displacement is the shortest path you can take from the initial point to your destination,
then I’d say that after making full circles, my overall displacement is 0. But I have sure as hell come a long way in learning Python as a programming language, just by reading, understanding and implementing new code concepts more than ever before.
Week #2 – 28/05 to 04/06
Well, this has been a good week of learning new things, revising old concepts and reading the implementation of one of the oldest modules in Python to understand the idea behind Python Packaging. I feel bad about not being able to write a lot of code, but without understanding the existing code base, the way forward would have been fruitless, and more disappointing.
So, let’s start by answering our three infamous questions, and later I’ll give you a broad picture of Python Packaging, which I will try to explain to you like a 5-year-old.
Week #1 – 21/05 to 27/05
In the last meeting, my mentors and I decided upon the mini-project that I suggested. Here’s a brief overview of what I decided to work with over the course of the last week.
I thought remote work would be easy. I thought it would be comfortable, creative and carefree. I thought I would be so efficient that I’d be in the zone all the time. If only I knew how wrong I was. If only I had looked beyond the perks of working remotely to see the dedication and discipline needed. And the best part is that I love the concept even more.
Someone truly great once said, “If you can’t explain it simply, you don’t understand it well enough.” That’s the way of life, isn’t it? Hence, I would like to take a shot at explaining how Scrapy‘s bigger brother, Spidermon, validates the data scraped by spiders and how the gears turn; in the process, I would learn as well. Let’s DIG IN!