Extracting Fundamental Stock Data from EDGAR using our favorite language: Common Lisp (Part 3)
In our last adventure, detailed in Part 2, we successfully extracted the Earnings Per Share (EPS) for Apple straight from the EDGAR API. Now, let’s press on.
Let’s take a closer look at the data we gathered in our previous post. We’re dealing with a structure that unfolds into several layers of nesting, ready to be explored.
Let’s inspect our data. The :UNITS entry of the association list (or alist, as Common Lisp calls it) contains the data we need. Digging a bit deeper, nestled under :UNITS, we discover the :+USD+/SHARES key; each entry under it carries the value we’re after in its :VAL field. Venturing into :+USD+/SHARES, we observe that the Earnings Per Share (EPS) figures from various reports (such as 10-K and 10-Q) are laid out across different reporting dates.
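To make that shape concrete, here is a rough sketch of the relevant slice of the response. The keys match what we saw above, but the values and the number of fields per entry are hypothetical, for illustration only:

```lisp
;; Hypothetical excerpt of the nested alist -- a real response holds many
;; more entries, and each entry has more fields than shown here.
(defparameter sample-data
  '((:units
     (:+usd+/shares
      ((:fy . 2022) (:form . "10-K") (:val . 6.15))
      ((:fy . 2023) (:form . "10-Q") (:val . 1.89))))))
```

Each report is itself an alist, so pulling a field out of one entry is just another assoc/cdr step.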
We can navigate through our data using assoc and cdr as our navigational aids, methodically working our way down to the layer where our EPS data lives. Here’s a sample that grabs the :FY and :VAL fields for only the 10-K reports using a loop and puts them in a list called eps:
;;; We store our data in a variable called "data"
(setf data (get-company-concept "AAPL" "us-gaap" "EarningsPerShareBasic"))

;;; And now we traverse the nested alist until we get to our EPS information
(setf eps (loop for entry in (cdr (assoc :+usd+/shares (cdr (assoc :units data))))
                when (equal (cdr (assoc :form entry)) "10-K")
                  collect (list (cdr (assoc :fy entry)) (cdr (assoc :val entry)))))
Now we can perform some operations on this list. For example, let’s calculate the average EPS of our dataset:
(/ (reduce #'+ (mapcar #'second eps))
   (length eps))
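One caveat: the snippet above assumes eps is non-empty. If the 10-K filter matched nothing, (length eps) would be 0 and the division would signal a DIVISION-BY-ZERO error. A defensive version (a sketch of my own, not from the original code) might look like this:

```lisp
(defun average (numbers)
  "Return the mean of NUMBERS, or NIL if the list is empty."
  (when numbers
    (/ (reduce #'+ numbers) (length numbers))))

;; Usage with our data:
;; (average (mapcar #'second eps))
```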
I must confess, navigating through the alist to fetch our data isn’t the most thrilling part of our journey. If you’ve got a sleeker, more elegant method up your sleeve, please let me know in the comments below.
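Until someone shares something better, one possible approach (a small helper of my own devising, not part of any library) is a function that follows a chain of alist keys, CDR-ing down one level per key:

```lisp
(defun alist-path (alist &rest keys)
  "Walk ALIST along KEYS, returning whatever sits at the end of the path."
  (reduce (lambda (acc key) (cdr (assoc key acc)))
          keys
          :initial-value alist))

;; The traversal from earlier then collapses to:
;; (alist-path data :units :+usd+/shares)
```

It performs exactly the same (cdr (assoc ...)) steps as the explicit version, just folded over the key list.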
In our next post, we’ll take our adventure a step further by plotting our data. Stay tuned, and see you in the next installment!