git --force-with-lease Gotcha

It’s always better to use the --force-with-lease option over --force when force pushing to a remote branch. However, it’s important to remember that --force-with-lease isn’t magic.

--force-with-lease simply checks your remote-tracking ref (.git/refs/remotes/origin/<branch>) against the actual remote branch. This means that the following:

git fetch
git push --force-with-lease

or even just running git pull on a different branch (which performs essentially the same git fetch) will make the push behave the same as a plain --force and nuke any changes made by another developer.

Here’s an example of a common workflow that renders --force-with-lease unable to prevent the push:

# Changes made by another developer are present but not in your local remote refs.
# You want to rebase your changes.
git checkout master
git pull # updates your remote-tracking refs for all branches, not just master
git checkout -
git rebase master
# Now this acts the same as --force and overwrites changes made by the other developer.
git push --force-with-lease
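
One way to keep the lease meaningful is to name the expected remote commit explicitly. --force-with-lease accepts an optional <refname>:<expect> value, so the push only goes through if the remote branch still points at the commit you specify (the branch name and SHA below are placeholders):

# Only force the push if origin's my-feature still points at abc1234.
git push --force-with-lease=my-feature:abc1234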


Why CORS is Useful

Cross-Origin Resource Sharing (CORS) uses HTTP headers to tell the browser whether a cross-origin request is allowed. The browser performs the request and checks the response for CORS-related headers. If the headers are present and the page’s origin is whitelisted in the server’s CORS policy, the browser allows the request and lets the page read the response.

I didn’t fully understand the purpose of CORS at first, since it can be easily circumvented by proxies, browser extensions, or just a simple script.

For example, if website a.com implements a CORS policy allowing only requests from a.com to fetch resources, and website b.xyz attempts to programmatically load images from a.com, the server will instruct the browser to block those requests. However, the owner of b.xyz can create a simple proxy server that fetches the resource server-side, where the browser’s CORS checks don’t apply, and strips or replaces any CORS-related headers. Then b.xyz makes a request to cors-stripper.com/a.com/cute_pupper.png and can ignore any CORS policy implemented by a.com.

A browser follows the same-origin policy by default, so requests made to a different origin will fail. This can be relaxed by setting the mode of a request to 'no-cors', but that severely restricts the request to prevent sending anything sensitive.

'no-cors' will restrict the request to

  • only GET, HEAD, and POST requests with CORS-safelisted (“simple”) headers [2]
  • an opaque response, meaning JavaScript cannot read the status, headers, or body

So in order to send a request with non-simple headers (like the Authorization header), the mode must be set to 'cors' and the server must return the correct CORS-related headers. This prevents a malicious site from making credentialed requests on behalf of a user to another site and reading the responses.
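
Here’s a minimal sketch of both modes using the Fetch API (the a.com URLs and the token are placeholders):

// 'no-cors': the browser sends the request, but the response is opaque and unreadable.
const opaque = await fetch("https://a.com/cute_pupper.png", { mode: "no-cors" });
console.log(opaque.type); // "opaque" — status reads as 0 and the body is inaccessible

// 'cors' with a non-simple header: the browser preflights the request and only
// proceeds if a.com responds with matching Access-Control-Allow-* headers.
const allowed = await fetch("https://a.com/api/data", {
  mode: "cors",
  headers: { Authorization: "Bearer <token>" },
});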



preexec

I recently learned about a shell hook called preexec (built into zsh, and available in bash via the bash-preexec project). I found it useful for working with the TaskWarrior CLI.

preexec() { clear; }

Now, whenever I execute a command, the terminal clears itself before running it.
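
The hook also receives the command line that is about to run as its first argument, so you can scope the behavior. Here’s a sketch that only clears before TaskWarrior commands (assuming they all start with task):

preexec() {
  # $1 is the command about to be executed; only clear for TaskWarrior commands.
  case "$1" in
    task*) clear ;;
  esac
}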


Working With Large Files Using Elixir

Here’s an example of how you can use streams to read and manipulate very large files without busting your memory:

[input_file, output_file] = System.argv

File.stream!(input_file)
|> Stream.reject(fn line ->
  case line do
    "INSERT INTO `logs`" <> _ -> true
    "INSERT INTO `states_changes`" <> _ -> true
    _anything -> false
  end
end)
|> Stream.map(fn line ->
  String.trim(line) <> "\n"
end)
|> Stream.into(File.stream!(output_file))
|> Stream.run()

Let’s break this down. On the first line we pattern match on System.argv, which returns the list of arguments passed to the script. We could use Elixir’s built-in OptionParser, but we don’t need it for our purposes. Its usage looks like:

elixir trim_large.exs input_file.sql output_file.sql

Next we grab a stream from the file using File.stream!/3 and use Stream.reject/2 to drop the lines we don’t want. We then remove any leading or trailing whitespace with Stream.map/2 and String.trim/1, re-appending the newline so each record stays on its own line in the output.

We use Stream.into/2 to direct our working stream into the output file’s stream. Finally, we call Stream.run/1 to actually execute the stream. This is similar to calling Enum.to_list/1 to realize an enumerable, but since we only care about the side effect of writing the file, Stream.run/1 runs the pipeline without building up a list of results.
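
For contrast, here’s a rough sketch of an eager version of the same transformation (same placeholder file names). It works for small files, but File.read!/1 loads the entire file into memory before any processing happens, which is exactly what the streaming version avoids:

[input_file, output_file] = System.argv

trimmed =
  input_file
  |> File.read!()
  |> String.split("\n", trim: true)
  |> Enum.reject(&String.starts_with?(&1, ["INSERT INTO `logs`", "INSERT INTO `states_changes`"]))
  |> Enum.map_join("\n", &String.trim/1)

File.write!(output_file, trimmed <> "\n")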


Elixir Streams

Elixir streams are essentially lazily evaluated enumerables. You can do a lot of neat things with them.

Calling map on an Enum executes immediately, whereas a stream is composable: you can chain several maps and filters together without actually executing anything until you want to. Here’s an example:

a = 1..3

IO.puts "Eager enumeration:"

a
|> Enum.map(fn _ -> IO.puts("x") end)
|> Enum.map(fn _ -> IO.puts("o") end)

IO.puts ""
IO.puts "Now lazy enumeration:"

a
|> Stream.map(fn _ -> IO.puts("x") end)
|> Stream.map(fn _ -> IO.puts("o") end)
|> Enum.to_list

Running this in iex will produce:

Eager enumeration:
x
x
x
o
o
o

Now lazy enumeration:
x
o
x
o
x
o

Enum prints all the x’s first and then the o’s, whereas the Stream prints an x and then an o for every element.

You can also do neat things with Stream.unfold/2, like wrapping random number generation in an enumerable interface:

random = Stream.unfold(nil, fn _ -> {:rand.uniform(100), nil} end)

# Get 10 random numbers
values = Enum.take(random, 10)

# Get only even numbers
even_random =
  random
  |> Stream.filter(fn e -> rem(e, 2) == 0 end)
  |> Enum.take(10)

# Make a long random number
long_random = Enum.take(random, 10) |> Enum.join

We use Enum with the stream as an argument when we actually want values back, and Stream when we want to keep composing. Be careful not to call Enum.to_list/1 on this random stream: it’s infinite, so the call will never return and will eventually exhaust memory.
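
To make that split concrete, here’s a small sketch that keeps composing lazily and only materializes values at the end (reusing the random stream above):

# Each Stream call just wraps the computation; nothing runs until Enum.take/2.
random
|> Stream.filter(fn n -> rem(n, 2) == 0 end)
|> Stream.map(fn n -> n * n end)
|> Enum.take(3)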

