
Archive for the ‘Bible’ Category

Read through Hebrews One Verse at a Time in 2023 with AI Help

Tuesday, December 27th, 2022

The 2023 Daily Cross Reference Bible Reading Plan (also an RSS feed) walks you through the 303 verses in the book of Hebrews one day at a time, six days a week, with a review every Saturday. It includes up to twelve of the most-popular cross references for each verse, as well as an AI-generated summary of how each cross reference relates to the main verse. Each day also contains an AI-written introduction and a concluding prayer that tie together the themes between the main verse and its cross references.

For example, one of the explanations for January 1 connects Hebrews 1:1 and Genesis 3:15 like this:

Both passages refer to God’s plan of salvation. Hebrews 1:1 refers to God’s promise of redemption through the prophets, while Genesis 3:15 refers to the promise of a Redeemer who would come to defeat Satan and restore humanity.

The quality of the content generated by the AI (GPT-3) feels generally comparable to the typical evangelical devotional; I review the generated content by hand before posting it.

In 2016, I proposed a digital-first Bible reading plan that goes through the Gospels in a year, including all the cross references for each verse. This reading plan implements that idea with an AI twist and exposes you to 2,295 different verses, or around 7% of the whole Bible.

If you’re looking for a full-year, brisk reading plan for 2023 that you haven’t done before, you might give the Daily Cross Reference Bible Reading Plan a try. You can bookmark that page, which will update itself every day, or subscribe to the RSS feed. To get a sense of what the content is like before committing, between now and January 1, the reading plan features some seasonally appropriate verses chosen by ChatGPT.

Exploring AI-Assisted Bible Study

Tuesday, August 2nd, 2022

AI-Assisted Bible Study is a new project that explores one way to apply an AI to personal Bible study, with AI-generated questions and prayers that apply to each chapter of the Bible. It helps you explore questions you might not otherwise ask, like “How have I let my livestock get in the way of my relationship with God?”

A screenshot of the tool, with books and chapters in a grid at the top followed by "Headings" and "Short Summaries" with AI-generated content.

What This Project Does

This project presents AI-generated content for each chapter of the Bible in eight categories: headings, summaries, prayers, journal prompts, and application, exegetical, observational, and discussion questions.

For example, here’s sample AI-generated content for John 3:

  • Heading: Jesus is the light and life of the world.
  • Summary: Nicodemus, a Pharisee and member of the Jewish Sanhedrin, comes to Jesus at night to talk to him. Jesus tells him that he must be born again to see the kingdom of God.
  • Prayer: Father, we pray that we would have hearts like Nicodemus, that we would be willing to learn from Jesus.
  • Journal prompt: What do you think it means that Jesus said we must be born again?
  • Application question: What does it mean that “whoever does not believe stands condemned already” in John 3:18?
  • Exegetical question: What did Nicodemus misunderstand about being born again?
  • Observation question: What did Jesus tell Nicodemus he must do in order to see the kingdom of God in John 3:3?
  • Group discussion question: What did Jesus mean when He said that He must be “lifted up”? (John 3:14)

You can vote on content you find helpful or unhelpful. (I’m particularly proud of the CSS that handles the voting, which uses emojis as interface elements and doesn’t require any images. I’m also proud of the navigation, providing fast and compact access to any chapter in the Bible.)

How It Works

I prompted GPT-3 to generate text for each chapter in the Bible in each category. For example, the prompt to generate a prayer was:

Write 5 prayers inspired by John 3 in the Bible. Remember that the events described here are in the past. First include a short observation or lesson for each prayer, and then write a personal prayer related to the lesson.

I reviewed the generated text to avoid (or at least minimize) unhelpful or heretical content. I accepted about 90% of GPT-3’s suggestions on its first pass and regenerated the rest until it gave me something useful. It cost about $150 over six weeks to generate this content, which consists of 71,062 generations and 1.1 million words.
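The per-chapter, per-category prompting lends itself to a simple template table. Here's a minimal sketch of how the batch could be assembled; only the prayer wording comes from the prompt quoted above, and the other templates and the `build_prompt` helper are hypothetical:

```python
# Sketch of batch prompt construction. Only the "prayer" template is
# taken from the post; the other templates are invented placeholders.
PROMPT_TEMPLATES = {
    "prayer": (
        "Write 5 prayers inspired by {chapter} in the Bible. "
        "Remember that the events described here are in the past. "
        "First include a short observation or lesson for each prayer, "
        "and then write a personal prayer related to the lesson."
    ),
    # Hypothetical templates for two of the other categories:
    "heading": "Write a one-sentence heading for {chapter} in the Bible.",
    "summary": "Summarize {chapter} in the Bible in two sentences.",
}

def build_prompt(category: str, chapter: str) -> str:
    """Fill in the chapter reference for the requested category."""
    return PROMPT_TEMPLATES[category].format(chapter=chapter)

print(build_prompt("prayer", "John 3"))
```

Looping a helper like this over all 1,189 chapters and eight categories would produce the roughly 10,000 base prompts behind the 71,062 generations.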

How It Doesn’t Work

Much of the content is useful—about the level you’d find in a typical group Bible study, with interesting insights mixed with odd and irrelevant content. When the content fails, it fails in four main ways:

  1. Heretical. This is the most severe category, which I tried most to eliminate. For example: “Help me to be like Judas and have the courage to betray Jesus when the time comes” or “What would it be like to be worshipped as a god?”
  2. Wrong. This is the hardest category to edit at scale. It includes factual errors (“David is forgiven, and Bathsheba’s son is healed,” “After Paul makes his defense, Agrippa finds him not guilty, but the Jews disagree and appeal to Caesar”) but also harder-to-discern, subtler errors like “What can we learn from Nahum 2:15-16 about God’s wrath?” (Nahum 2 only has 13 verses). Since I didn’t validate every reference, I expect that this category represents the bulk of unhelpful content. The project’s voting mechanism hopefully allows the helpful content to rise to the top over time.
  3. Confusing or very specific: “David rescues his family from Soup,” “How can I identify when someone is trying to lead a rebellion against me?” or the aforementioned “How have I let my livestock get in the way of my relationship with God?” It also likes to generate prayers for historical events as though they’re ongoing: “God, we pray for our leaders, that they would have wisdom to know what to do with the Book of the Law once it is found.”
  4. Vague: “What does Amos 3 reveal to us about God’s character?” or “What are the main points of Amos 5?” This content isn’t bad; it just doesn’t apply specifically to the passage.


In theory, GPT-3 could also generate on-demand answers to the questions it asks about each passage. Doing so would require giving visitors access to the AI, however, which (per OpenAI’s requirements) requires that I create a login system—not something I’m excited to do.

It could also create content at a smaller unit than a chapter (such as a verse or section). In my tests, the content it generated often proved superior to full-chapter content, but going smaller would’ve ballooned the costs of this project.


In my last post about AI-generated Bible art, I mused how the text- and image-generating AIs were doing most of the creative work, and I was just copy-pasting between them. That’s true, but in a larger sense, the AIs are allowing me to explore a possibility space faster and further than I would be able to on my own. As David Holz, the founder of Midjourney (another AI-powered text-to-image generator), says:

“It’s important that we don’t think of this as an AI ‘artist.’ We think of it more like using AI to augment our imagination. It’s not necessarily about art but about imagining. We are asking, ‘what if.’ The AI sort of increases the power of our imagination.”

Thinking of AI as an “imagination augmenter” captures that it’s not “creating” in the strictest sense but rather augmenting humans, allowing them to create at a speed and scale that wouldn’t otherwise be possible individually.

Therefore, this project tries to augment your imagination in your own Bible study.

Try out AI-Assisted Bible Study.

Visualizing Bible chapter similarity with Quid

Thursday, December 22nd, 2016

Quid, a natural-language processing and visualization startup*, last month produced a network graph of chapter similarity in the King James Bible. It does a good job clustering the gospels and epistles in the New Testament, though I might argue that you can largely distinguish them simply by the presence of “Jesus +said” vs. “Jesus -said”. Their full post details their methodology and colors the same visualization several different ways, including by sentiment and by popularity.

Read the article on Quid's site.
Credit: Quid

* They actually call themselves “a platform that searches, analyzes and visualizes the world’s collective intelligence to help answer strategic questions.”

Trending thirty years of Bible translations on Google Scholar

Sunday, November 22nd, 2015

Google Scholar keeps track of book citations, including citations of the Bible, in academic works. By crafting careful queries, we can try to identify trends in Bible translation usage among scholars:

Share of Bible Translation Citations in Google Scholar, by Year
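Once the yearly citation counts are collected, producing the shares in the chart above is a straightforward per-year normalization. A sketch, with invented counts:

```python
# Sketch of converting raw per-year citation counts into yearly shares.
# The counts below are invented for illustration, not real data.

def yearly_shares(counts: dict[str, dict[int, int]]) -> dict[int, dict[str, float]]:
    """counts maps translation -> {year: citations}.
    Returns year -> {translation: share of that year's total citations}."""
    years = {y for per_year in counts.values() for y in per_year}
    shares: dict[int, dict[str, float]] = {}
    for year in sorted(years):
        totals = {t: per_year.get(year, 0) for t, per_year in counts.items()}
        year_total = sum(totals.values()) or 1  # avoid dividing by zero
        shares[year] = {t: c / year_total for t, c in totals.items()}
    return shares

counts = {"KJV": {2014: 800, 2015: 780}, "NRSV": {2014: 120, 2015: 130}}
print(yearly_shares(counts)[2015]["NRSV"])
```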


First, the prevalence of the King James Version surprises me, since in general I’d expect biblical scholars to cite more modern translations in their work. However, it turns out that when scholars outside the field of biblical studies cite the Bible (generally for just a single quote), they’ll often use the KJV. Since Google Scholar doesn’t limit results to only religious scholarship, the KJV comes out on top.

Second, scholars prefer the NRSV and RSV more than the wider Christian audience does: the NRSV has held a roughly 12% share of scholarly citations since its introduction but is responsible for under 2% of Bible translation searches on Google. The RSV (and even the KJV) declined in scholarly share shortly after the release of the NRSV in 1989.

Third, the only translations to gain substantial scholarly share over the past thirty years are the NIV, the NRSV, and the ESV; the latter two are revisions of the RSV.


I constructed a spreadsheet of thirty-two major English Bible translations and how many citations they had each year from 1984 to 2015. (My favorite article that I came across discusses “Voldemort Phrases” (PDF)–the generic “he,” as in, “He Who Must Not Be Named”–in Bible translations.)

This methodology has several major limitations; therefore, you shouldn’t read too much into the exact numbers but should instead focus on broader patterns. Overall, however, the percentages largely match my expectations.

The first limitation is that the queries are imperfect: “ESV,” for example, can serve as the abbreviation for any number of phrases (e.g., “end-systolic volume”). While I tried to pick queries that appeared to yield relatively few false positives, they’re definitely still there. I couldn’t combine queries (e.g., [niv bible or “new international version”]), so the absolute numbers shouldn’t be taken too literally.

Second, Google Scholar’s definition of “scholarly” work is fairly loose; some of the fluctuations in certain translations may be the result of Google changing its scope over time.

Third, a straight counting approach, as here, doesn't necessarily best represent scholarly influence. However, I couldn't do anything more sophisticated, since Google temporarily blocked my access to Google Scholar a few times while I was collecting even this basic data by hand. (They felt that it looked like I was running automated queries.)

Inspired by Metacanon.

The Bible on Twitter in 2014

Tuesday, December 30th, 2014

Bible Gateway recently shared their most-popular Bible verses of 2014, and I wanted to discuss this chart a little more:

Popular Bible verses by day in 2014 on Bible Gateway

The chart stems from the idea that if someone is equally likely to see a verse on any day of the year, each day should account for 1/365, or 0.27%, of that verse's yearly popularity. This chart shows the days when a verse's pageviews spiked: whenever a single day exceeded 0.4% of the verse's annual total.
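The spike rule can be sketched in a few lines; the threshold and the 1/365 baseline come from the paragraph above, while the sample data is invented:

```python
# Sketch of the spike rule: a day counts as a spike for a verse when it
# holds more than 0.4% of that verse's annual pageviews (the uniform
# baseline would be 1/365, about 0.27%).

SPIKE_THRESHOLD = 0.004  # 0.4% of the annual total

def spike_days(daily_views: list[int]) -> list[int]:
    """Return 0-based day-of-year indices whose share of the verse's
    annual views exceeds the spike threshold."""
    total = sum(daily_views)
    if total == 0:
        return []
    return [day for day, views in enumerate(daily_views)
            if views / total > SPIKE_THRESHOLD]

# A verse viewed 10 times a day all year, plus a 100-view burst on day 85:
views = [10] * 365
views[84] += 100
print(spike_days(views))  # only the burst day clears the threshold
```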

The theme of the chart is that people follow certain paths through the Bible during the year; I labeled a few of them on the chart. But there are definitely a few patterns I can’t explain:

  1. At the beginning of the year, two lines emanate from Genesis that look like they’re on track to read the full Bible in a year, but one of them is faster than the other. Why are there two?
  2. At the bottom right of the chart is a shallow line that looks like it involves reading Genesis and Exodus starting in May and ending in December. There’s a similar line in the New Testament running through Matthew from June to November. What are those?

I was curious whether the same patterns would appear on Twitter for the year, so I ran a similar analysis on the 43 million tweets this year that mentioned Bible verses. The answer is that, yes, you can see many of the same paths in both charts:

Popular Bible verses by day in 2014 on Twitter

They even include the same two (or three or four) fast readings of the Bible at the beginning of the year and the slow reading of Genesis and Exodus in the second half of the year. You can see similar peaks around the Passion stories leading to Easter and the Nativity story leading to Christmas. (Christmas is the last day that appears on this chart.) The Twitter chart more clearly shows the weekly rhythms of the devotional life, with vertical lines just barely visible every Sunday. The main difference is that there’s not as clear a path through the New Testament.

The Twitter chart also shows some horizontal bands where sharing is pretty light. These “sharing shadows” appear in the opening chapters of Numbers, 1 Kings, and 1 Chronicles.

Prolific Verse Sharers

A quirk of the Twitter chart is that some Twitterers tweet (and are retweeted) a lot. I suspect many of them are bots, but it’s hard to say whether they constitute “Bible spam”–many people do appear to find them helpful by retweeting them. The top fifty or so Twitterers are responsible for 16 million of the 43 million tweets this year. The chart doesn’t look too different if you remove them (mostly, the frequent repetition of Matthew disappears), but that could just be because I didn’t remove enough users to affect the results meaningfully. For all I know, this chart mostly just shows how Twitter bots share the Bible during the year. The consistency with the Bible Gateway data (in which I have more confidence), however, leads me to think that this picture is reasonably accurate.

Here are the top non-bot (as far as I can tell) sharers of Bible verses–these people tweeted the most Bible verses (and, more importantly, were retweeted most) throughout the year. Some of these people I recognize, and others… not so much. The “tweet” numbers reflect only tweets containing Bible verses and include others’ retweets of their tweets.

  1. JohnPiper (105,836 tweets)
  2. DangeRussWilson (87,382 tweets)
  3. WeLiftYourName (52,638 tweets)
  4. JosephPrince (50,889 tweets)
  5. BishopJakes (49,109 tweets)
  6. siwon407 (48,994 tweets)
  7. RickWarren (42,637 tweets)
  8. JoyceMeyer (39,703 tweets)
  9. jeremycamp (32,003 tweets)
  10. DaveRamsey (28,173 tweets)
  11. RCCGworldwide (26,731 tweets)
  12. AdamCappa (25,976 tweets)
  13. Creflo_Dollar (24,422 tweets)
  14. sadierob (20,068 tweets)
  15. Carson_Case (19,846 tweets)
  16. TimTebow (18,303 tweets)
  17. Kevinwoo91 (17,230 tweets)
  18. levimitchell (16,355 tweets)
  19. jesse_duplantis (15,755 tweets)
  20. kutless (14,806 tweets)

Most-Popular Verses

Here are the most-popular verses shared on Twitter in 2014:

  1. Phil 4:13 (613,161 tweets)
  2. 1Pet 5:7 (261,417 tweets)
  3. Prov 3:5 (218,019 tweets)
  4. John 14:6 (212,883 tweets)
  5. John 13:7 (207,084 tweets)
  6. 1Cor 13:4 (197,379 tweets)
  7. Matt 28:20 (187,407 tweets)
  8. Ps 118:24 (183,475 tweets)
  9. 2Tim 1:7 (182,758 tweets)
  10. Ps 56:3 (180,139 tweets)

You can also download a text file (411 KB) with the complete list of 2014’s popular verses.

John 13:7 (“Jesus replied, ‘You do not realize now what I am doing, but later you will understand.'”) is the oddball here, but it turns out that it’s mostly from over 100,000 retweets of a single tweet in April. (Since it was a one-off, I omitted him from the list of top sharers above, although his tweet count of 163,497 would put him in first place.)

How do the year’s most-popular verses compare among Bible Gateway, YouVersion, and Twitter? The answer: there’s a good deal of variation. Below are the top ten from each service; only Proverbs 3:5 appears in all three lists, and YouVersion and Twitter only have one verse that overlaps, which surprises me (given that they’re both based on sharing).

If we look only at Bible Gateway and Twitter, the average verse differs in its ranking by about 3,000 places, or nearly 10% of the Bible. The largest differences in rank: 1 Kings 20:14 is much more popular on Twitter (rank 4,380) than on Bible Gateway (rank 27,119), while Ezra 5:14 is way more popular on Bible Gateway (rank 13,995) than Twitter (rank 30,018).

Ranking Bible Gateway YouVersion Twitter
1. John 3:16 Rom 12:2 Phil 4:13
2. Jer 29:11 Phil 4:8 1Pet 5:7
3. Phil 4:13 Phil 4:6 Prov 3:5
4. Rom 8:28 Jer 29:11 John 14:6
5. Ps 23:4 Matt 6:33 John 13:7
6. Phil 4:6 Phil 4:7 1Cor 13:4
7. 1Cor 13:4 Prov 3:5 Matt 28:20
8. Prov 3:5 Isa 41:10 Ps 118:24
9. 1Cor 13:7 Matt 6:34 2Tim 1:7
10. Rom 12:2 Prov 3:6 Ps 56:3

Bold entries appear in at least two lists.
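The rank comparison between Bible Gateway and Twitter boils down to a mean absolute difference in rank over verses ranked by both services. A sketch with toy data, using the two extreme examples mentioned above:

```python
# Sketch of the rank-difference comparison. The real input would be
# full rankings of roughly 31,000 verses; these three are toy data
# (1 Kings 20:14 and Ezra 5:14 ranks come from the post).

def mean_rank_difference(ranks_a: dict[str, int],
                         ranks_b: dict[str, int]) -> float:
    """Average absolute rank difference over verses ranked by both services."""
    shared = ranks_a.keys() & ranks_b.keys()
    return sum(abs(ranks_a[v] - ranks_b[v]) for v in shared) / len(shared)

bible_gateway = {"1Kgs 20:14": 27119, "Ezra 5:14": 13995, "Prov 3:5": 8}
twitter = {"1Kgs 20:14": 4380, "Ezra 5:14": 30018, "Prov 3:5": 3}
print(mean_rank_difference(bible_gateway, twitter))
```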

Data Source

The Twitter data is from Bible Verses on Twitter. A program connects to the Twitter Streaming API with a query for every chapter of the Bible (“Gen 1”, “Genesis 1”, and so on). I run a Bible reference parser on the tweet to pull out all the references. Then an SVM algorithm tries to guess whether the tweet is actually talking about a Bible verse or just happens to contain a string that looks like a Bible reference (“Gen 1 XBox for sale,” where “Gen” is short for “Generation”).
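As a rough sketch of the extraction step, a regex-based parser might look like the following. This is not the actual parser: the real one handles far more book-name variants, ranges, and ambiguity, the SVM classifier is not shown, and the book table here is abbreviated and hypothetical:

```python
import re

# Simplified sketch of Bible reference extraction from tweet text.
# The book table is abbreviated; a real table covers all 66 books and
# their common abbreviations. An SVM would then filter false positives
# like "Gen 1 XBox for sale".
BOOKS = {"gen": "Genesis", "genesis": "Genesis",
         "phil": "Philippians", "john": "John"}

REF_RE = re.compile(r"\b([1-3]?\s?[A-Za-z]+)\s+(\d+)(?::(\d+))?\b")

def extract_references(text: str):
    """Return (book, chapter, verse-or-None) tuples found in the text."""
    refs = []
    for book, chapter, verse in REF_RE.findall(text):
        name = BOOKS.get(book.lower().replace(" ", ""))
        if name:  # skip words that merely look like book names
            refs.append((name, int(chapter), int(verse) if verse else None))
    return refs

print(extract_references("I can do all things! Phil 4:13"))
```

Note that a pure regex happily extracts "Genesis 1" from "Gen 1 XBox for sale"; that's exactly the kind of false positive the classification step has to catch.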

Sidenote: How I Calculate Verse Views

A note on methodology: I’ve never documented how I determine a particular verse’s popularity; now’s a good time, because you can do it a number of ways to reach different answers. Let’s say that someone is looking at Genesis 1, which has 31 verses. That counts as one pageview, but if you’re looking for the number of pageviews that, say, Genesis 1:1 receives, how do you attribute a chapter-length view like this? You could give each verse credit for a full pageview, but then verses in long chapters will appear to have a disproportionately high number of pageviews. Instead, I prefer to divide the pageview into the number of verses in the passage: in this case, each verse in Genesis 1 will receive 1/31, or 0.032 pageviews.

Now, what if someone is looking at, say, Genesis 1:1 and Matthew 1 (25 verses) on the same page? In this case, I divide the pageview by the number of separate passages: Genesis 1:1 receives credit for a full 0.5 pageviews, as does Matthew 1. Each verse in Matthew 1 therefore receives 0.5/25, or 0.02 pageviews.

I feel that this approach best respects people’s intentions whether they want to look at multiple verses, several independent passages, or just individual verses.
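The attribution rules described above can be sketched in a few lines:

```python
# Sketch of the verse-credit rules: split a pageview evenly across the
# passages on the page, then split each passage's share evenly across
# its verses.

def verse_credit(passages: list[list[str]]) -> dict[str, float]:
    """passages: one list of verse IDs per passage shown on the page.
    Returns each verse's share of the single pageview."""
    credit: dict[str, float] = {}
    passage_share = 1 / len(passages)
    for verses in passages:
        per_verse = passage_share / len(verses)
        for verse in verses:
            credit[verse] = credit.get(verse, 0.0) + per_verse
    return credit

# Genesis 1:1 alongside all of Matthew 1 (25 verses) on one page:
page = [["Gen 1:1"], [f"Matt 1:{v}" for v in range(1, 26)]]
credit = verse_credit(page)
print(credit["Gen 1:1"], credit["Matt 1:1"])  # 0.5 and 0.02
```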

Jim LePage’s Illustrations of Every Bible Book

Friday, November 11th, 2011

Jim LePage has just finished a two-year project in which he’s created an illustration for every book of the Bible. The always-underappreciated Obadiah is my favorite:

A giant hand reaches for a bird, with the caption, “Though you soar like the eagle, I will bring you down. Obadiah.”

Jim also runs Gettin’ Biblical, a site that showcases non-schlocky Christian-themed artwork. I particularly enjoyed The Savior collage and the papercut-esque Burning Bush. Good examples of “Christian art” (a difficult term to define if you’ve ever talked to artists who are Christians) are hard to come by, and I appreciate Jim’s efforts to collect them.

Update September 2016: Removed outdated link to Gettin’ Biblical.

Quantifying Traditional vs. Contemporary Language in English Bibles Using Google NGram Data

Monday, December 27th, 2010

Using data from Google’s new ngram corpus, here’s how English Bible translations compare in their use of traditional vs. contemporary vocabulary:

Relative Traditional vs. Contemporary Language in English Bible Translations
* Partial Bible (New Testament except for The Voice, which only has the Gospel of John). The colors represent somewhat arbitrary groups.

Here’s similar data with the most recent publication year (since 1970) as the x-axis:

Relative Traditional vs. Contemporary Language in English Bible Translations by Publication Year


The result accords well with my expectations of translations. It generally follows the “word for word/thought for thought” continuum often used to categorize translations, suggesting that word-for-word, formally equivalent translations tend toward traditional language, while thought-for-thought, dynamically equivalent translations sometimes find replacements for traditional words. For reference, here’s how Bible publisher Zondervan categorizes translations along that continuum:

A word-for-word to thought-for-thought continuum lists about twenty English translations, from an interlinear to The Message.

I’m not sure what to make of the curious NLT grouping in the first chart above: the five translations are more similar than any others. In particular, I’d expect the new Common English Bible to be more contemporary–perhaps it will become so once the Old Testament is available and it’s more comparable to other translations.

In the chart with publication years, notice how no one tries to occupy the same space as the NIV for twenty years until the HCSB comes along.

The World English Bible appears where it does largely because it uses “Yahweh” instead of “LORD.” If you ignore that word, the WEB shows up between the Amplified and the NASB. (The word Yahweh has become more popular recently.) Similarly, the New Jerusalem Bible would appear between the HCSB and the NET for the same reason.

The more contemporary versions often use contractions (e.g., you’ll), which pulls their score considerably toward the contemporary side.

Religious words (“God,” “Jesus”) pull translations to the traditional side, since a greater percentage of books in the past dealt with religious subjects. A religious text such as the Bible therefore naturally tends toward older language.

If you’re looking for translations largely free from copyright restrictions, most of the KJV-grouped translations are public domain. The Lexham English Bible and the World English Bible are available in the ESV/NASB group. The NET Bible is available in the NIV group. Interestingly, all the more contemporary-style translations are under standard copyright; I don’t know of a project to produce an open thought-for-thought translation–maybe because there’s more room for disagreement in such a project?

Not included in the above chart is the LOLCat Bible, a non-academic attempt to translate the Bible into LOLspeak. If charted, it appears well to the contemporary side of The Message:

The KJV is on the far left, The Message is in the middle, and the LOLCat Bible is on the far right.


I downloaded the English 1-gram corpus from Google, normalized the words (stripping combining characters and making them case insensitive), and inserted the five million or so unique words into a database table. I combined individual years into decades to lower the row count. Next, I ran a percentage-wise comparison (similar to what Google’s ngram viewer does) for each word to determine when they were most popular.

Then, I created word counts for a variety of translations, dropped stopwords, and multiplied the counts by the above ngram percentages to arrive at a median year for each translation.
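The final median-year step can be sketched as a count-weighted median over each word's peak-popularity year. The words, counts, and peak years below are invented for illustration, not real ngram values:

```python
# Sketch of the count-weighted median year. Each word contributes its
# peak-popularity year, weighted by how often it appears in the
# translation. Sample values are invented.

def weighted_median_year(words: dict[str, tuple[int, int]]) -> int:
    """words maps word -> (count in translation, peak ngram year).
    Returns the count-weighted median of the peak years."""
    pairs = sorted((year, count) for count, year in words.values())
    total = sum(count for _, count in pairs)
    running = 0
    for year, count in pairs:
        running += count
        if running * 2 >= total:  # crossed the halfway point
            return year
    return pairs[-1][0]

words = {"thee": (120, 1790), "covenant": (40, 1860), "said": (300, 1945)}
print(weighted_median_year(words))
```

Because common words like “said” dominate the counts, even a translation full of archaisms lands within a fairly narrow band of years, which matches the compressed 1838-1878 scale described below.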

The year scale (x-axis on the first chart, y-axis on the second) runs from 1838 to 1878, largely, as mentioned before, because Bibles use religious language. Even the LOLCat Bible dates to 1921 because it uses words (e.g., “ceiling cat”) that don’t particularly tie it to the present.


The data doesn’t present a complete picture of a translation’s suitability for a particular audience or overall readability. For example, it doesn’t take into account word order (“fear not” vs. “do not fear”). (I wanted to use Google’s two- or three-gram data to see what differences they make, but as of this writing, Google hasn’t finished uploading them.)

I work for Zondervan, which publishes the NIV family of Bibles, but the work here is my own and I don’t speak for them.

Evaluating Bible Reading Levels with Google

Saturday, December 11th, 2010

Google recently introduced a “Reading Level” feature on their Advanced Search page that allows you to see the distribution of reading levels for a query.

If we constrain a search to Bible Gateway and restrict URLs to individual translations, we get a decent picture of how English translations stack up in terms of reading levels:

According to this methodology, the Amplified Bible is the hardest to read (probably because its nature is to have long sentences), and the NIrV is the easiest.

Caveats abound:

  1. URLs don’t have a 1:1 correspondence to passages, so some passages get counted twice while others don’t get counted at all.
  2. Google doesn’t publish its criteria for what constitutes different reading levels.
  3. These numbers are probably best thought of in relative, rather than absolute, terms.
  4. Searching translation-specific websites yields different numbers. For example, constraining the search to the ESV’s own site yields 57% Basic / 42% Intermediate results, massively different from the 18% Basic / 80% Intermediate results above.

Download the raw spreadsheet if you’re interested in exploring more.

Presentation on Tweeting the Bible

Friday, March 26th, 2010

Here’s a presentation I just gave at the BibleTech 2010 conference about how people tweet the Bible:

Also: PowerPoint, PDF.

I distributed the following handout at the presentation, showing the popularity of Bible chapters and verses cited on Twitter. It displays a lot of data: darker chapters are more popular, the number in the middle of each box is the most popular verse in the chapter, and sparklines in each box show the distribution of the popularity in each chapter. (Genesis 1:1 is by far the most popular verse in Genesis 1, while Genesis 3:15 is only a little more popular than other verses in the chapter.)

The grid shows the popularity of chapters and verses in the Bible as cited on Twitter.

750 Memory Verses for You

Wednesday, May 13th, 2009

Download 750 verses (or short combinations of verses) for your memory work or as the basis for your own list of verses. The Bible text in the file is the ESV and is therefore copyrighted, but the compilation of actual verse references is available under a CC-BY license. Do what you like with them.

The verses appear in descending order of popularity (as determined by an analysis of verses on this site), so if you’re only looking for 100 verses, you can just grab the first 100. There are 750 total to give you more than two years’ worth of daily verses, if you want.


When I was putting together a list of daily memory verses a few years ago, I was surprised to discover that I couldn’t find a freely usable list of verses: I ended up combing through a bunch of books and combining various sources to produce a list of about 400 verses. If you’re doing something similar, why should you go through the same pain?