How To Build A Web App, part 18 of ?: jsonapi-resources tests my patience
This is the eighteenth in a series of articles taking you through all the actual steps in building a web app. If you’re an aspiring developer, if mucking around with teensy beginner tutorials frustrates you, if you’d love to build a properly substantial app that does fab things, these articles are for you.
Word to the wise: these tutorials don’t depict a polished, pristine workflow. You won’t find “follow directions A, B, C” taking you in a perfectly straight line. Nah. These tutorials depict mess and chaos and grit. They meander. They show what real web-dev is actually like. It’s a standard trait of professionals that you should make your trade look easy. Not here, baby. If I’ve written these articles properly, beginner-to-intermediate devs will read them and think “Oh thank Christ, turns out these self-proclaimed Senior Developers struggle just as much as me! My impostor syndrome is just a syndrome!”
Last time, we decided that a damned fine way of making our Rails/database model serialization happen is the jsonapi-resources gem. Today, let us continue our sojourn.
Postman
Let’s introduce you to another Thing. Postman.
Postman is a testing tool. It fires off HTTP requests at whatever URL you care to name. You can set request headers and get response headers. Postman is fab. Postman is darling. Check this out:
You fire HTTP requests at the URL of your choice. You can modify every last little fiddly-about-y bit you please: query params, authorization params, header values, whatever you like. Postman then receives the response and displays every last little fiddly-about-y bit in turn.
You can see here I’ve just fired off a GET http://localhost:3000/api/v1/venues?filter[acts]=1,2. You can also see in that screenshot’s left-bar that I’ve fired off a zillion other HTTP requests at /api/v1/venues, messing around with the value of filter[acts], trying to figure out how jsonapi-resources parses its query params. Postman allows me to easily submit all kinds of values: filter[acts]=1,2, filter[acts]=1+2, 1;2, 1–2, foo,bar, on and on.
The goal is to submit an array of positive integers, you see, and jsonapi-resources interprets 1,2 as an array, i.e. ['1', '2'], but it interprets pretty much everything else as just the raw string.
I’d wanted to find out why, soooo…
…I’ve sunk entirely too many hours into seeking out the exact line of jsonapi-resources’ internal code responsible. But I found it at last! Turns out it simply calls CSV.parse_line on the input, which, you guessed it, transforms a comma-separated list into an array. Awesome.
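You can see this for yourself in a Ruby console. It’s plain old stdlib CSV behaviour, nothing jsonapi-resources-specific (the example values below are my own illustration, not lifted from the gem):
require 'csv'

CSV.parse_line('1,2')      #=> ["1", "2"]      (comma-separated: split into an array of strings)
CSV.parse_line('1;2')      #=> ["1;2"]         (no comma: one single raw string)
CSV.parse_line('foo,bar')  #=> ["foo", "bar"]  (it doesn't care whether they're numbers)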
Okay, that takes care of some of our validation. But, bottom line: when we submit values to filter[acts] that aren’t positive integers, how do we want our API to respond? With the same specific error messages as before? Or just disregard and discard non-positive-integers?
You know what? Given that this API endpoint’s only customer, so far, will be our UI client app … I’m honestly just tempted to throw out all that JsonSchema validation stuff we’d written for our SearchController. It’s overkill. Just use to_i to transform all string-digits into actual integers: those get queried, and everything else gets discarded. Simplifies things hugely. Easy peasy.
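In concrete terms, I’m picturing something along these lines. This is a minimal sketch of the idea, not final code; the param-digging and variable names are just my placeholder choices:
# Somewhere in the request-handling code (sketch only):
raw = params.dig(:filter, :acts).to_s   # e.g. "1,2", "1;2", "foo,bar", or nil

act_ids = raw.split(',')                # comma split, mirroring what jsonapi-resources does
             .map(&:to_i)               # "1" => 1, "foo" => 0
             .select(&:positive?)       # zeros and negatives get discarded

# act_ids is now a clean array of positive integers, e.g. [1, 2]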
What might we test, then? I’m thinking something like this:
# spec/requests/api/v1/venues_spec.rb
describe "GET index" do
  before { get api_v1_venues_path, params: params }

  describe "sending no act IDs at all" do
    it "returns the most recent 20"
    it "conforms to the json-api standard"
  end

  describe "sending IDs 2,4,5" do
    it "returns acts 1,2,10,11"
  end
end
Let’s get cracking.
Wrong docs are worse than no docs
Oh, yay. Jsonapi-resources is testing my patience.
Here’s why. I’d hoped we could just knuckle down and crank out a few lovely wholesome tests. Something you could take home to your ol’ Mama without fear of embarrassment. I’d hoped we could write something like this:
it "conforms to the json-api standard" do
expect(response_json['data'][...]).to eq "[json string]"
end
Same as before, we’d be comparing the API’s response json with whatever jsonapi-resources should serialize our models to. "[json string]"
above should be a call to its serializer.
To serialize an ActiveRecord model, so it claims, you do this:
post = Post.find(1)
serialized_post = JSONAPI::ResourceSerializer.new(PostResource)
  .serialize_to_hash(PostResource.new(post, nil))
Er, okay. Bit verbose, but what the hell. Let us replicate that. Into the Rails console we hop.
irb(main):001:0> venue = Venue.find(1)
Venue Load (0.8ms) SELECT "venues".* FROM "venues" WHERE "venues"."id" = $1 LIMIT $2 [["id", 1], ["LIMIT", 1]]
=> #<Venue id: 1, name: "Payne Arena", ticketmaster_id: "KovZpZAJJtIA">
irb(main):002:0> serialized_venue = JSONAPI::ResourceSerializer.new(Api::V1::VenueResource).serialize_to_hash(Api::V1::VenueResource.new(venue, nil))
Traceback (most recent call last):
1: from (irb):2
NoMethodError (undefined method `serialize_to_hash' for #<JSONAPI::ResourceSerializer:0x00007fb155bc2448>
Did you mean? serialize_to_relationship_hash)
undefined method `serialize_to_hash'? Errwhat?
No, seriously. Prolonged manual testing did in fact reveal that #serialize_to_hash doesn’t exist within jsonapi-resources.
Oh yay, another bug.
**Googling**
Well. Turns out I’m not the only one who’s encountered it. I’m afraid this happens every so often. Just because a library’s owner decides to release a specific version of that library to the public, sure as sunrise doesn’t mean it’s in a properly usable state.
How do we solve that?
Two options leap out at me: (1) swap out the exact version of jsonapi-resources we’re using with another that does contain it; or (2) just abandon jsonapi-resources altogether.
I can’t deny that (2) has a certain appeal … but let’s not spurn this bloody library just yet. Let’s attempt (1).
How to manually set a gem’s version
First, get the complete list of all versions available. Every publicly available Ruby gem has its own rubygems.org page. This is jsonapi-resources’ versions page, listing all 82 of its versions, macro, micro and bugfixro.
Which one should we use? I commenced my signature faffing-around, on that Github bug report page and others, and it would appear the Gem version with a working #serialize_to_hash is 0.9.11.
Into the Gemfile we jump. We change
# Gemfile
...
gem 'jsonapi-resources'
...
To
# Gemfile
...
gem 'jsonapi-resources', '0.9.11'
...
Then update with bundle update jsonapi-resources, and we’re done!
Gem version control via Bundler is a whole ‘nother Thing in itself. Read this for more.
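(If you want to be a touch less rigid about pinning than I’m being here, Bundler also understands version ranges. A quick sketch of the alternatives, purely illustrative; you’d pick exactly one of these lines, not all three:)
# Gemfile
gem 'jsonapi-resources', '0.9.11'               # exactly this version, nothing else
gem 'jsonapi-resources', '~> 0.9.11'            # this version or any later 0.9.x bugfix release
gem 'jsonapi-resources', '>= 0.9.11', '< 0.10'  # the same idea, spelled out long-hand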
Let us reattempt serialization.
irb(main):001:0> venue = Venue.find(1)
Venue Load (1.2ms) SELECT "venues".* FROM "venues" WHERE "venues"."id" = $1 LIMIT $2 [["id", 1], ["LIMIT", 1]]
=> #<Venue id: 1, name: "Payne Arena", ticketmaster_id: "KovZpZAJJtIA">
irb(main):002:0> JSONAPI::ResourceSerializer.new(Api::V1::VenueResource).serialize_to_hash(Api::V1::VenueResource.new(venue, nil))
=> {:data=>{"id"=>"1", "type"=>"venues", "links"=>{"self"=>"/api/v1/venues/1"}, "attributes"=>{"name"=>"Payne Arena"}, "relationships"=>{"gigs"=>{"links"=>{"self"=>"/api/v1/venues/1/relationships/gigs", "related"=>"/api/v1/venues/1/gigs"}}}}}
Success! A valid JSON-API serialization. That’ll do nicely! Let’s resume our testing.
By the way, I’d better point out that it’s not out of the question that bumping down our gem version might cause problems elsewhere. We’re getting quite fiddly with this gem, and it’s not inconceivable that this plate-spinning act we’re setting up here might be more trouble than it’s worth. But time will tell. Onward.
Franken-coding
Tangent time! Don’t worry, this one is much briefer.
Remember Franken-coding? We first encountered this in article 12. It refers to coders wodging together vast numbers of totally unrelated libraries. They fight each other. They get in each others’ way. What I’m about to show you is only a very minor example indeed, but I think it’s worth mentioning.
Remember that test we’d written to verify the Content-Type HTTP response header value?
# spec/requests/search_spec.rb
...
describe "Querying API with zero params of any kind" do
  let(:params) { {} }

  it "returns the jsonapi Content-Type" do
    expect(response.headers['Content-Type'])
      .to eq 'application/vnd.api+json; charset=utf-8'
  end
end
I’d not run this test since installing anything jsonapi-resources-related. Well **ahem**:
$ rspec spec/requests/search_spec.rb:38
Run options: include {:locations=>{"./spec/requests/search_spec.rb"=>[38]}}
F

Failures:

  1) Search Querying API with zero params of any kind returns the jsonapi Content-Type
     Failure/Error: expect(response.headers['Content-Type']).to eq 'application/vnd.api+json; charset=utf-8'

       expected: "application/vnd.api+json; charset=utf-8"
            got: "application/vnd.api+json"

       (compared using ==)
     # ./spec/requests/search_spec.rb:39:in `block (3 levels) in <top (required)>'

Finished in 0.7712 seconds (files took 3.95 seconds to load)
1 example, 1 failure

Failed examples:

rspec ./spec/requests/search_spec.rb:38 # Search Querying API with zero params of any kind returns the jsonapi Content-Type
What happened? Turns out jsonapi-resources mucks about with our HTTP response headers. But that’s okay! It’s supposed to. It’s a fully fledged drop-in to JSON-API-ify Rails, remember? Its entire point is that you add it to your Gemfile and it’ll handle everything from there. Naturally it’ll automatically do certain things we’d already set up manually ourselves, right?
But as we can see, it doesn’t quite do it identically. That charset=utf-8 bit has vanished. But eh, it’s not the end of the world, honestly. The important bit was the application/vnd.api+json bit. That initializer file register_json_mime_types.rb is no longer necessary; we can delete it.
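So the fix to the failing test is a one-liner: drop the charset from the expected value. Roughly this (the fully refactored version of this block appears a little further down):
# spec/requests/search_spec.rb
it "returns the jsonapi Content-Type" do
  expect(response.headers['Content-Type']).to eq 'application/vnd.api+json'
end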
Let’s commit all that: installing our specific version of jsonapi-resources; losing our register_json_mime_types.rb file; modifying this test; adding resource files; other fripperies. Commit and diff, sweet.
Writing generalised JSON-API tests
Okay, where were we? We were writing tests governing our API’s general response behaviour, that’s what.
So there’s a Thing in RSpec. Shared examples. RSpec being RSpec, it just has to implement its own way of doing things. Shared Examples is the RSpec-flavoured way of bog-standard function-calling:
shared_examples_for 'our little demo example' do |param1, param2|
  it 'does Thing One' do
    expect(param1).to eq 'bar'
  end

  it 'does Thing Two' do
    expect(param2).to eq 'baz'
  end
end

RSpec.describe 'one' do
  include_examples 'our little demo example', 'param1a', 'param2a'
end

RSpec.describe 'two' do
  include_examples 'our little demo example', 'param1b', 'param2b'
end
This runs four tests: the two shared examples, included once per describe block.
I mention this because I’d rather like to write a set of shared tests to be run on every single API response test. Our Content-Type header test, for example. It’s important that this header is set for every single request and response. Makes sense to test that, right?
And a few other things: every single API response will return some kind of jsonapi-schema object. We can test that too. And include these shared examples later in every single other test. Faster.
More faffing later…
I have refactored our test block thusly:
# spec/requests/search_spec.rb
...
shared_examples_for 'general jsonapi behaviour for' do |klass|
  it 'returns the jsonapi Content-Type' do
    expect(response.headers['Content-Type']).to eq 'application/vnd.api+json'
  end

  it 'returns an array of jsonapi objects' do
    expect(response_json['data']).to match_jsonapi_array_schema
  end

  it 'returns all with type=="klass name"' do
    expect(response_json['data']).to match_jsonapi_array_types_for klass
  end
end
...
describe 'Querying API with zero params of any kind' do
  let(:params) { {} }

  include_examples 'general jsonapi behaviour for', Venue
end
Two things may jump out at you here. First, what’s klass? Second, what are these match_*_* … matchers?
klass first. Easy. The variable we’re kicking around here is an actual Ruby class. An ActiveRecord class. But Ruby, like any language, has reserved keywords, and class is one of them. So klass just means “this variable is a class but ‘class’ is naughty so we’re using ‘klass’ instead.”
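To make that concrete, here’s a throwaway console illustration (nothing app-specific about it):
# class = Venue          # SyntaxError: `class` is a reserved keyword
klass = Venue             # perfectly legal: just a variable holding a class

klass.model_name.plural   #=> "venues", which is exactly what the matcher we're about to write compares against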
Now, these matchers. I have created another spec helper file:
# spec/helpers/jsonapi_helpers.rb
jsonapi_object_schema = {
type: 'object',
required: ['id', 'type', 'attributes', 'relationships'],
properties: {
id: { type: 'string' },
type: { type: 'string' },
attributes: {
type: 'object',
},
relationships: {
type: 'object',
},
links: {
type: 'object',
},
related: {
type: 'object',
},
}
}

jsonapi_array_schema = {
type: 'array',
items: jsonapi_object_schema
}

RSpec::Matchers.define :match_jsonapi_array_types_for do |klass|
match do |candidate_array|
candidate_array.all? {|item| item['type'] == klass.model_name.plural }
end
end

RSpec::Matchers.define :match_jsonapi_array_schema do
match do |candidate_jsonapi_array|
JSON::Validator.fully_validate(jsonapi_array_schema, candidate_jsonapi_array).length == 0
end
end
This is a simple, basic little helper file I’d thrown together. The matchers inside it are accessible from every spec file.
The only concept we’ve not encountered already is RSpec::Matchers.define. It’s an RSpec custom matcher. All it does is fire up a JSON validator, and ensure the test array conforms to the JSON API standard. Or at least the teeny-tiny bit of it that jsonapi_object_schema defines. It just covers a few basics, and I’ll probably add to this as necessary.
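(In case the matcher internals look opaque: JSON::Validator.fully_validate, from the json-schema gem we’re already using, returns an array of error messages; an empty array means the data is valid. Something like this, with made-up data purely for illustration:)
require 'json-schema'

valid_item   = { 'id' => '1', 'type' => 'venues', 'attributes' => {}, 'relationships' => {} }
invalid_item = { 'id' => 1 }   # id must be a string, and the other required keys are missing

JSON::Validator.fully_validate(jsonapi_array_schema, [valid_item])    #=> []
JSON::Validator.fully_validate(jsonapi_array_schema, [invalid_item])  #=> a non-empty array of error strings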
Okay! Let’s give these tests a whirl.
$ rspec spec/requests/search_spec.rb:48
Run options: include {:locations=>{"./spec/requests/search_spec.rb"=>[48]}}
...

Finished in 1.14 seconds (files took 3.02 seconds to load)
3 examples, 0 failures
Okay, what’s next? Completing the refactoring of our Act ID tests, that’s what. Onward!
The Laconic Test
Ever heard of something called “The Laconic Phrase”? Long story short, it’s the art of saying much, with few words. Read that link for more.
Let us embrace this philosophy. You’ll recall that our first Act-ID-based test tested returning every single Venue:
# spec/requests/search_spec.rb
...
context 'Not supplying any Act IDs' do
  let(:params) {}

  describe 'Returning every single Venue' do
    subject { response_json }

    it { is_expected.to include(
      *[venue1, venue2, venue3, venue4, venue5].map { |venue|
        venue.as_json(include: :gigs)
      }
    )}
  end
end
I have become dissatisfied with this kind of test. It’s sloppy! You’ll recall that, back in Article 15, when we’d kicked off with adding timestamps to our Gigs (haven’t forgotten about that, by the way; quite a bit of its testing code is on my machine right now as I write this, just haven’t committed it yet), we halted midway through when I discovered my API was returning 13 Venues instead of 5.
This test right here doesn’t pick that up. It’s sloppy. Instead, I’d much rather favour something like this:
# spec/requests/search_spec.rb
...
let!(:venue1) { create :venue, updated_at: Time.now - 9.days }
let!(:venue2) { create :venue, updated_at: Time.now - 3.days }
let!(:venue3) { create :venue, updated_at: Time.now - 7.days }
let!(:venue4) { create :venue, updated_at: Time.now - 2.days }
let!(:venue5) { create :venue, updated_at: Time.now - 8.days }
...
context 'Not supplying any act IDs' do
  let(:params) {
    { filter: { acts: '' } }
  }

  it 'returns every venue, ordered by updated_at desc' do
    expect(response_json['data']).to eq(
      [venue4, venue2, venue3, venue5, venue1].map { |v|
        JSONAPI::ResourceSerializer.new(Api::V1::VenueResource)
          .serialize_to_hash(Api::V1::VenueResource.new(v, nil))
      }
    )
  end
end
This will still need a ton of polishing: #serialize_to_hash takes all kinds of options and params and fiddly-about-y bits to help you customise attributes, relationships, links, etc.
But, written properly, this will test that our API does indeed return exactly five Venues; ordered by updated_at descending; with all their associated Gigs attached to each one.
All in the one test. Laconic! Let’s get cracking.
t.timestamps
I wrote and ran that new test. This happened:
$ bundle exec rspec spec/requests/search_spec.rb:64
Run options: include {:locations=>{"./spec/requests/search_spec.rb"=>[64]}}
F

Failures:

  1) Search Filtering by Act Not supplying any act IDs returns every single venue
     Failure/Error: let!(:venue1) { create :venue, updated_at: Time.now - 9.days }

     NoMethodError:
       undefined method `updated_at=' for #<Venue:0x00007fbcc1248cd8>
       Did you mean?  update
     # ./spec/requests/search_spec.rb:28:in `block (2 levels) in <top (required)>'

Finished in 0.6644 seconds (files took 3.84 seconds to load)
1 example, 1 failure

Failed examples:

rspec ./spec/requests/search_spec.rb:64 # Search Filtering by Act Not supplying any act IDs returns every single venue
The hell?
I’m trying to set each Venue’s #updated_at attribute, but it’s not letting me! Did I do something wrong? ActiveRecord has always been configured such that if you create a database table column, you can read and write that table model’s attributes automatically, no further config necessary. Had Rails changed this in its latest ActiveRecord versions?
A few minutes’ messing about revealed the answer. I’d simply forgotten to add default timestamps to my models.
Remember timestamps? I embarked on a general definition of timestamps back in article 15, when I rashly assumed we’d then immediately kick off with adding start/end timestamps to our search API. Alas.
If you’ve not yet used ActiveRecord migrations to build your tables’ columns (which we’d first done back in Article 8), the idea is, migrations contain their own built-in set of functions to construct columns, set their names, set datatypes, default values, null/non-null, all kind of things.
One of those things is timestamps. There are two columns you can whack onto any ActiveRecord-based database table: created_at and updated_at. They’re both datetimes. The first is populated with whatever point in time that table row was first created, and will never change in future (there’s nothing stopping you changing it manually, mind you, but that way lies madness and spaghetti). The second is updated with right now, every single time you update any of the other attributes: it records when that model was most recently updated.
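Once the columns actually exist (which, spoiler, is the punchline of this section), the behaviour looks roughly like this. A console sketch, values invented:
venue = Venue.create!(name: 'Payne Arena')
venue.created_at   #=> the moment of creation
venue.updated_at   #=> the same moment, for now

venue.update!(name: 'Payne Arena (Renamed)')
venue.updated_at   #=> bumped to the moment of that update
venue.created_at   #=> unchanged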
But you have to add them yourself. Here’s my migration file again:
# db/migrate/20190910020025_create_acts_gigs_venues_tables.rb
class CreateActsGigsVenuesTables < ActiveRecord::Migration[5.2]
  def change
    create_table :acts do |t|
      t.string :name
    end
    ...
  end
end
It should have been this:
# db/migrate/20190910020025_create_acts_gigs_venues_tables.rb
class CreateActsGigsVenuesTables < ActiveRecord::Migration[5.2]
  def change
    create_table :acts do |t|
      t.string :name
      t.timestamps
    end
    ...
  end
end
t.timestamps handles creating created_at/updated_at. Or would have done, had I been in possession of the astounding and terrifying coding prowess I’d been claiming these last 18 articles.
Migration time! Time to create a new migration to add them in at last.
An entirely new migration, you say? Could we not just simply modify that existing one, and rerun it? Yes we could: there’s nothing stopping us … but it’s almost always a bad idea.
Why? Imagine you’re part of a coding team. You’re all working on the same code base, faffing with features, faffing with bugs, faffing with no particular aim or goal. You’ll periodically write a Git branch, push it to Github, submit something called a Pull Request (another Thing, read that link for more), merge your branch into master … then pull its latest contents back into your local copy of master. And so will everyone else. You’ll make database changes. You’ll write migrations. New migration file? Sweet, run it. That’s all. Everyone understands that.
But if you modify an existing migration? You’ve got to let the others know. Manually. And not just your teammates. Anyone who’s ever yoinked a copy of your code base in the entire universe, up ’til your migration modification, must be hunted down, and educated. Ugh.
But contain the exact same changes inside a new migration file? Easy peasy. Let’s write one.
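Rails will generate the empty, timestamped migration file for us; we just fill in the body. (The class name here is my own choice; Rails only cares that it matches the filename.)
$ rails generate migration AddTimestamps
# creates db/migrate/20200504095206_add_timestamps.rb, ready for us to fill in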
This will do nicely:
# db/migrate/20200504095206_add_timestamps.rb
class AddTimestamps < ActiveRecord::Migration[5.2]
  def up
    %w(gigs acts venues).each do |model|
      %w(created_at updated_at).each do |column|
        add_column model, column, :datetime,
          null: false,
          default: Time.now
      end
    end
  end

  def down
    %w(gigs acts venues).each do |model|
      %w(created_at updated_at).each do |column|
        remove_column model, column
      end
    end
  end
end
Two things may jump out at you. First, what are these up/down methods? Don’t migrations use change? Second, why does default: Time.now exist?
Brief answers: first, every migration must be reversible. Most individual ActiveRecord migration actions have an obvious reverse-action, so you can just write a change method and ActiveRecord will figure that out automatically. But if you’re doing something really fiddly, you can use the up/down methods instead, for finer manual control. That’s what I’m doing here.
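For contrast, here’s the simple case, as a hypothetical little migration that isn’t part of our app: with change, ActiveRecord can infer the reverse on its own, and rails db:rollback will run that inferred reverse (or our hand-written down, in the migration above).
# A hypothetical example, purely for contrast:
class AddCapacityToVenues < ActiveRecord::Migration[5.2]
  def change
    # ActiveRecord knows the reverse of add_column is remove_column,
    # so `rails db:rollback` can undo this without us writing a `down` method.
    add_column :venues, :capacity, :integer
  end
end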
Second, these timestamp columns can’t ever be empty. By design, ActiveRecord model timestamps have a NOT NULL constraint (the constraint itself is a database/SQL thing, mind you, not a Rails thing; we’re just configuring our database from inside Rails, so we write a SQL NOT NULL constraint as a Rails-y null: false constraint).
But! Hundreds of rows already exist across all three tables. Create these timestamps and we’ll conjure into existence hundreds more empty cells. That’s what “null” means: it’s not a value, it’s the absence of a value. NOT NULL is just peachy when we’re creating a new table, because ActiveRecord will populate new rows’ timestamps automatically. But if we’re adding them to existing tables with shitloads of rows, then we have to populate those existing rows too. Otherwise: constraint violated. So we sidestep this by supplying a default value. It doesn’t matter gigantically what that value is, honestly … so let’s just go with Time.now.
Any other fiddly bits need reconfiguring, app-wide, re these timestamps? Sure. Resource attributes: I don’t see any reason not to return created_at and updated_at with our API response.
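Roughly speaking, that just means adding them to the resource’s attribute list. A sketch only; I’m quoting the file path and existing attribute list from memory:
# app/resources/api/v1/venue_resource.rb (path per jsonapi-resources convention)
class Api::V1::VenueResource < JSONAPI::Resource
  attributes :name, :created_at, :updated_at
end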
Oh yes and FactoryBot attributes. Those too. Let’s commit and diff and resume writing our Act ID specs.
And that, I think, is a fantastic effort for today. Next time, we’ll resume our Act ID test refactoring.