Official sequel to Cracking the Coding Interview is out (interviewing.io)
38 points by stmw 17 hours ago | 73 comments





I don't think this book, or any book for that matter, is particularly useful for interviews unless you want to learn the extreme basics. And even if you are trying to learn the basics, I'm not sure that this book, EPI, etc. are enough to prepare you for a real interview.

The best way, unfortunately, really is to grind Leetcode, and to use ChatGPT to ask questions and explain answers or topics. ChatGPT is very good at explaining well-defined things like CS topics and LC answers, in my opinion. It most likely slurped up the contents of this book, and you can ask it questions directly, so I think that's probably how I will do it in the future.

In addition, the book doesn't seem to cover systems design which is an extremely critical part of the interview process for L4 and above.


The book is not just basics. I've read most of the technical chapters and it is quite in-depth, more so than any online resource. Plus there is the AI interviewer, which is free and probably better than trying to prompt ChatGPT yourself.

Agreed. The book talks about niche topics in more depth than most resources online. In addition to all the topics in the book, we're actively releasing free online chapters each month covering more niche topics found in interviews.

> trying to prompt chatgpt yourself

These days, I can't even be bothered. Sometimes I type 2 words, maybe 3, and o3 will do all the CoT-style prompting on its own. I don't mind the wait.

o3 is wonderful in exploring codebases, especially on GitHub Chat.


> In addition, the book doesn't seem to cover systems design which is an extremely critical part of the interview process for L4 and above.

It's called Cracking the *Coding* Interview for a reason. We don't talk about databases or concurrency or system design or meta-programming or any other topics that would take an entire other textbook to cover. It would be impossible to do it justice.


These kinds of books have really progressed since they first appeared -- this one could probably be fairly re-marketed as a two-volume set of 1. an advanced CS textbook and 2. a SWE candidate handbook, especially given the online resources and AI interviewer they have.

I have heard people say "grind Leetcode," but that never made any sense to me. Does Leetcode go over database design and development basics? The best jobs I ever had never needed me to grind on some meaningless test, because they knew how to conduct quality interviews.

This book is definitely the opposite of "grind leetcode", fwiw.

Appreciate the comment, and you're completely right. Our whole thesis is that doing what everybody else is doing will get you the same results that everybody else is getting. :)

Oh I know, but my point is that not every interview follows this process, which seems to screen for college grads or people who memorize things better.

>Technical interviews are much harder today than they used to be. Engineers study for months and routinely get down-leveled despite that.

Is this because of an interview prep arms race, or supply and demand, or what?

Is all of that interview prep, and higher standards for hiring, translating into developers who are stronger on the job? Or at least, stronger when the job calls for data structure skill?


Hey, my name is Aline. I'm one of the authors of Beyond Cracking the Coding Interview and the founder of interviewing.io.

I have also wondered about the arms race and how it weighs against market forces for the bar going up, especially because as founder of an interview prep platform, I am complicit in said arms race and don't feel great about it (though I think the good of starting interviewing.io far outweighs the bad, but that's a story for another time).

So I looked at the data.

Between 2015 and the first half of 2022, I'd argue that "the bar" was about the same, even though a bunch of interview prep resources sprang up during that time (interviewing.io was founded in 2015, as were Leetcode, Pramp, and Triplebyte; HackerRank a few years earlier; the list goes on).

Then, the tech downturn happened in 2022, and all of a sudden the bar jumped... because for the first time companies didn't feel like there was an acute shortage of candidates.

Here's some of the data about the bar. At interviewing.io, after each interview, whether it's mock or real, the interviewer fills out a rubric. The rubric asks whether the interviewer would move the candidate forward and also asks them to rate the candidate on a scale of 1 to 4 on coding ability, problem-solving ability, and communication.

We can look at what the average coding score was over time for passing interviews to see how much the bar has gone up.

Between 2016 and 2022, the bar grew a bit (from 3.3 to 3.4). Starting in 2022, it shot up to 3.7. Is this the be-all and end-all? Of course not. But to me this is compelling data that market forces >>> the interview prep industry.
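(For anyone curious what that aggregation looks like mechanically, here is a minimal hypothetical sketch in Python/pandas. The file name and column names are invented for illustration; this is not interviewing.io's actual schema or pipeline.)

    import pandas as pd

    # Hypothetical rubric export: one row per interview, with the interviewer's
    # 1-4 coding score and whether they'd move the candidate forward.
    rubrics = pd.read_csv("interview_rubrics.csv", parse_dates=["interview_date"])

    passing = rubrics[rubrics["would_advance"]]            # passing interviews only
    bar_by_year = (
        passing.assign(year=passing["interview_date"].dt.year)
               .groupby("year")["coding_score"]
               .mean()
               .round(2)
    )
    print(bar_by_year)  # the kind of series behind the "3.3 -> 3.4 -> 3.7" numbers above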


Hi Aline, thanks for the reply. That makes a lot of sense.

Even so, I do think it's worth considering what someone in your position can do to make things better. If an arms race is inevitable, can it at least be an arms race with positive externalities?

For example, if companies focused much more on security questions during interviews, that would create an incentive for devs to learn about security. Then we'd have more secure software as a positive externality -- in theory, at least.

If we could get companies to ask more questions about AI alignment, that could reduce risks from AI misalignment.

If we could get companies to ask more questions about optimizing the energy usage of apps and data centers, that could be good for the environment.

The pitch to hiring managers would be something like: The algorithms interview is not about algorithms per se. Algorithm knowledge is only somewhat useful on the job. Rather, the algorithms interview is about giving the candidate a chance to signal that they can master technical coding knowledge. It doesn't particularly matter what that technical coding knowledge is.

And, if you ask questions on topics besides the classic data structures and algorithms, that means you're measuring something different:

* You're measuring the candidate's passion to learn CS stuff that's not usually covered in interviews.

* You're measuring the candidate's ability to pick up something new on the fly.

* If you're transparent, and you publicize the topic(s) you interview for, you're measuring the candidate's passion for getting hired at your company in particular.

All of those measurements seem potentially more valuable than measuring how much time they had to study classic algorithms topics.


I love this comment, and as an author of this book, I don't disagree with a word of it.

A common misconception is that Gayle's original book put forth the "right" way to do interviews. Gayle neither invented nor encouraged the current interview structure; she discusses the timeline in more depth in a Blind AMA thread you can find online. I think a lot of people are under the impression that books like these somehow steer the interview process toward this style of interview. At this point, all we are doing is looking at the process as it is TODAY and trying to provide transparency and equal information to everyone. We spend several chapters in this book talking about how broken the process is and making points similar to yours, but we can't write a book on an interview process that doesn't exist, and while Gayle's original book is well-circulated, neither she nor any of the rest of us authors has sway over how big tech companies conduct their hiring.

With that said, I think we are seeing companies start to incorporate other interviews precisely for the reasons you've mentioned. It isn't uncommon for smaller tech companies especially to have a DS&A interview but also to include a system design interview, and maybe even a practical "build something simple like a tic-tac-toe game in front of me while I watch" kind of interview. I do believe things are getting better and more fair over time (remember that two decades ago Google was literally asking riddles in an attempt to screen people). I don't buy into the narrative that these interviews should go away entirely (and if they did, it would take at least a decade), because they are still a reasonably effective way to interview people at scale. The Pragmatic Engineer guy actually had a great take on this here: https://x.com/GergelyOrosz/status/1891212829346435103


100%. I agree with everything you said. The interview is supposed to be about whether someone can master new stuff. I've heard good interviews described as "being smart together". Drilling on questions flies in the face of that.

Here's the problem. It's really hard to get companies to change.

A bit of a long-winded answer that I hope will come back around...

I don't feel great about my role in perpetuating the interview prep industrial complex. It wasn't my intention; to this day, interviewing.io is first and foremost a hiring company that's trying to make eng hiring efficient and fair (though because of the downturn we've been doing less hiring than we'd like). Our goal is to find the best performers, regardless of who they are and how they look on paper, and get them into any company they want.

So why do we do mock interviews?

Mock interviews were supposed to be a way to attract people to our platform, not an end in itself. Over time, though, mock interviews have become a larger and larger part of our business, and that’s part of the reason I wanted to write this book. I'm hopeful that changing the conversation around prep and empowering engineers to learn the material rather than memorize the material will make a dent in an industry that is, as of now, headed in the wrong direction.

The other thing we’re doing at interviewing.io is gathering a massive interview data set. Unlike other mock interview platforms, we don’t tell our interviewers what questions to ask or how to interview. Instead, we let each interviewer run their own process. This means that we end up with a lot of “interview biodiversity” on our platform.

I'm hopeful that, over time, we’ll be able to use that data to do a retroactive analysis where we look at users’ outcomes and which mock interviews they passed and failed to figure out which types of interviews carry the most signal and, over time, conclusively come up with some best practices around the best way to surface good engineers. Because it can’t be what our industry is doing now!

The other piece of the puzzle is having enough mindshare, when we do figure out the answers, to have people listen. We've already blogged a LOT about what companies can do to hire better. Here's our collection of posts specifically for employers and how they can hire better: https://interviewing.io/blog/category/for-employers-how-to-h...

The reality is that employers don't care. They won't change their behavior unless you're somehow making things 10X cheaper or faster. So that's what we want to do, both with respect to how companies hire (making it be about what people can do instead of the brands on their resumes) and with respect to how companies interview (hopefully moving away from toy algorithmic problems that people can drill on and memorize).


Hey there, I'm Mike (an author of BCtCI). I totally get where you're coming from -- there are a lot of flaws in the interview process, no doubt about it.

When Cracking the Coding Interview was originally written, the intention wasn't to claim it was a perfect process or to give ideal questions that should be asked. In fact, there are many questions in the original book that I wouldn't recommend any interviewer use. The real purpose was to shed light on a process that had been shrouded in secrecy for decades, long before the book existed.

The reality is, like them or not, these types of interviews aren’t going anywhere—whether or not resources like mine are available. So the real question becomes: Should the process remain some insider-only system where only those with well-connected friends know what to expect and how to prepare? Or should we make it more accessible, ensuring that everyone has a fair shot with similar resources?

For me, the answer is clear: transparency and equal access for all. It’s not about endorsing the process—it’s about making sure the playing field is level.


That's fair. I didn't mean to condemn you. Sorry if it came across that way. Honestly, I was just curious. I'm in the (apparent) minority of devs who actually like solving leetcode-style problems, so the current system doesn't bother me a ton.

I wrote a reply to Aline which applies to you as well: https://news.ycombinator.com/item?id=43145953


Nah, I totally understand where you're coming from and didn't feel attacked. I responded to your reply! :)

So another one to consider:

It is not about a "higher" bar so much as a more "standardized" bar and, more importantly, about weeding out the "uncompliant" ones.

Leetcode-style preparation is pretty much a 2-3-hours-a-day, 1-2-month ordeal (give or take). Which means companies are seeking the types who have the time to invest in this repeatedly. It signals your willingness to go along with it, no questions asked.

Worse, this is not a one-off. No matter your expertise/experience level, unless you are doing this every day for fun, you need to invest time to stay sharp. Guess how many folks with families actually have this kind of time?

I consider myself a reasonably experienced engineer, both from a ton of side projects for fun and from long stints at FAANGs, plus being on both sides of the interview table (note I said experienced, not "great"), and I still have to devote 2-3 weeks of ~4 hours a day of leetcoding each time I change roles. It would be impossible if my family wasn't so forgiving!

Back to your point: these kinds of interviews have little to do with the bar and more to do with companies knowing they can put you through it, and with hiring those who are willing to give it all up for the company! They don't want (or know what to do with) the next Woz. Someone who will follow the rubrics will do.


I’ve been tempted to get it and am curious to hear any reviews of the book. I decided to get Alex Xu’s Coding Interview Patterns [1] instead, which I’ve been really enjoying.

[1] https://blog.bytebytego.com/p/my-new-book-coding-interview-p...


Hey, I know Shaun personally, and I think his book is excellent. Our books, despite seeming similar, are very different. His is about recognizing preset patterns, and that is about the extent of it. No outreach advice, negotiation strategies, resume opinions, behavioral help, etc. His book also seems to focus heavily on quantity, as he ends up going through 100+ leetcode problems.

I'd heartily endorse his book, but they definitely are more different than similar. EPI is also another excellent book with a very different style.

Here is a link to nine chapters of Beyond Cracking the Coding Interview for free in case you're still curious about it: https://bctci.co/free-chapters


Whoa, I appreciate the sneak peek PDF and response! CtCI was the first book I bought for grinding software engineer interviews. I’ll definitely revisit whether to get the new version after reading the first 9 chapters!

Happy to help. :)

The book also has a discord server (link in the free PDF!), so if you have other questions, feel free to ping me there with them! We are about to start the weekly leetcode contest, and there will be a write-up after it finishes on which techniques and templates we've used from the book, which might give you more of a sense of how different it is. Hope to see you there, my friend!


Your proactive responses have me sold! Just bought the book off Amazon. I'm looking forward to joining the Discord/Leetcode contest too. It's really hard for me to get my momentum going, and I'm hoping the contest will give me the boost I need!

Hey, thanks for the support. You can always reach out to me directly on Discord if you need anything. Nil and I wrote all of the technical content for the book, so we are happy to help in any way we can and do our best to stay available—especially to our supporters! :) I hope to see you there!

There are a bunch of reviews on Amazon (which has the benefit of showing who actually bought it); here is a very positive one:

"This is the best overall resource that I've seen for software engineering job searches and coding interviews. There's a lot of free online resources out there so I was on the fence about buying this, but this is the most high-quality and practically useful collection of tips, tricks, advice, and explanation of relevant CS data structures & algorithms concepts that I've seen. There's also a lot of practical advice on how to approach the job search. This, plus leetcode, would be my most highly recommended resources for coding interviews. I think the content of this book would be relevant and useful from the new grad level through the staff SWE level. It's also a totally different book than the original CTCI (which I also have), not just another iteration of the same thing. I never write reviews for anything, but I felt like I had to for this because I was surprised by how good this is."


I was really excited about the salary negotiation advice on interviewing.io, but it totally blew up in my face when I tried it for my last job offer. According to them, that's a red flag about the business, but I'm not convinced their techniques apply equally well to all company sizes. Small companies just operate differently, I think.

I think it might have come off as a bad cultural fit. Talking comp is always appropriate, but if you sound too interested in the cash and not the vision, startups might not like it.

Expecting someone who isn't even working there to be super enthusiastic about your vision just turns it into a competition of faking it; I guess that's fine if that's what you're hiring for.

What did you try? A comp discussion is a totally fair thing to start

More detailed reply in another comment in this tree

Hey! Not sure if you used our negotiation service or just used advice from one of our blog posts.

I'll write a long response below, but obviously my long response doesn't help you if your offer got rescinded for negotiating. If it did, I am really sorry and take some of the blame. Would you mind sharing more of your story (especially if it didn't fall into "small headcount" scenario above) so we can do a better job about putting caveats on our advice?

That said...

In general, our advice holds for both large and small companies, but there are some caveats around 1) how much headcount the company has, 2) whether you're a jerk (can't imagine this was the case here!), and 3) whether you're negotiating directly with a founder.

- You’re applying for a role at a very small startup. In this situation, it’s very possible that the startup literally just has one headcount, and in that scenario, they may just go with the candidate who accepts their offer first. To avoid this scenario, ask how many open headcount the position has. If it’s small and if you’re junior, then skipping the negotiation portion makes sense.

- You’re a giant jerk during negotiations or you blatantly waste the company’s time. When we say “giant jerk” we mean being straight up rude to your recruiter. We do not mean advocating for yourself or asking for more money. And when we talk about wasting the company’s time, we mean drawing out the negotiation process for months, asking to talk to more than 5 people on the team, and still not being able to make up your mind.

Also, regardless of these two scenarios, negotiating with founders is very different than negotiating with recruiters. It's a long explanation, so I'll post it as a reply.


I did not use your service, and that might have made a difference.

I don't blame you at all, I just wanted to share my experience. Here are some relevant details:

During the interview process, the company told me their budget for the position. The company has about 130 employees, about 40 engineers, so not super small but not large either. Upon receiving an offer, they offered me the max of their budget. It was an acceptable amount for my circumstances, but I also knew it was slightly below market rate. In an email, I wrote something along the lines of "I'm excited to join, but I need to consider some other offers. I would sign today for a 10% base salary increase." They told me to take a hike. Unfortunately I was bluffing (no other offers), and I had recently been laid off and needed a job, so I backtracked and was able to secure the position.

I think I made some mistakes in this process, such as

1. Not being honest with myself about whether or not their max salary was acceptable

2. Not securing multiple offers at the same time, as recommended by interviewing.io (though in my defense, I still don't know how to do this; recruiting teams work at extremely different speeds and I tried hard to get multiple companies to align - perhaps this is where the service would shine)

3. Bluffing about other offers when I didn't have leverage

I think reading all the articles and watching the videos on the site made me optimistic and feel like money was just waiting to rain down on me if only I asked for it, so at the time I was disappointed that it didn't work as expected. It's been about a year and a half since that experience, and at this point I'm at peace with it, but I think I'll behave slightly differently the next time I'm in a negotiation situation. My biggest takeaway is that if a company states a budget, it's valid to assume that it's a hard upper limit.


I also should have said in my top level comment, other content on the site was absolute gold and helped me a ton, particularly the system design info. I appreciated the site as a whole during my job search.

Ok, there are some things to keep in mind when negotiating with founders rather than recruiters. (I just grabbed this from the book and edited it down a bit.)

At young startups (seed and possibly Series A), there often isn't a recruiting team. Until a startup hires a recruiting team, usually one of the founders handles important hiring-related tasks, such as negotiation. Even if they do have one or two recruiters on staff, sometimes the founder hasn’t handed over the negotiation reins to them yet.

In case you do find yourself negotiating with a founder, here's how to handle it because if you treat them just like recruiters, the results could be disastrous.

Here’s the most important difference between recruiters and founders. Recruiters are doing a job, and at larger companies, they’re often following a script. Though we always advise you to be enthusiastic, gracious, and courteous (and we take great pains in our copy to do so), at the end of the day, there’s an undercurrent in our negotiations: if you read between the lines, we’re practically shouting, “Look! Here’s my leverage! I’m cold and calculating and indifferent! Money talks!”

For founders, everything is personal. The company is their baby, and talking to them like they’re a recruiter may upset them and make them think you’re wasting their time. Moreover, many founders are not experienced at hiring, especially if it’s their first company, and may not have had many candidate conversations. The playing field between you and a founder is, in this way, much more level than it would be with a recruiter.

Here’s a real story to drive these points home.

One of our users had just wrapped up the onsite at a seed-stage startup. The final step was to meet (once more) with the founders. During the call, they asked him where else he was interviewing. Using our playbook, he was deliberately vague and said that he was interviewing with a number of companies, at various stages, and was unable to share more information at that time.

The founders dug in and asked several more times if he was interviewing with big tech companies. He held the line, even though the founders were visibly surprised in the meeting that he wasn’t more forthcoming.

After the meeting, the candidate got an email from their recruiter saying that they wouldn’t be moving forward because “we aren’t sure what you’re looking for.”

What happened here? Here’s my best guess.

After spending several weeks of interviews telling them how interested he was in their startup, the founders were shocked to find that he wasn’t fully bought in and was still considering other companies. Moreover, going into the conversation, the founders were probably worried that this candidate was interviewing at FAANGs and other big companies and that there’s no way they’d be able to match big company compensation. It’s a real thing.

As such, they made the calculated decision to stop investing in this candidate because they’d be unlikely to close him.

As a former founder, I can relate to the visceral irritation that comes with feeling like someone is wasting your time. But as a former founder, this decision strikes me as naive – if a candidate is spending time with you, it’s on you to sell him, and you sell til the last minute! However, founders don’t always feel this way. Their time is limited and the opportunity cost of any time spent is very high. Moreover, as we discussed earlier, they’re not recruiters – they have less experience navigating these conversations, and for them it’s personal.

So how do you deal with founders? Here’s what we recommend. Incidentally, the advice below assumes that you’re OK with the cash hit you’d invariably take when working at a startup and that you’re going into it with your eyes open. If you’re not actually OK with it, please don’t waste founders’ time! And if you’re using small startups for interview practice, shame on you. Not only is it unethical, but the types of questions and feedback you’ll get will likely be non-indicative of what you’ll see at big companies.

When founders ask where else you’re interviewing, you can say that you’re excited about this company and that you’re talking to a mix of startups and big companies.

If they ask you about your comp expectations and/or if they push on the big company thing, and especially if you’ve previously worked at one, say up front that you understand startups can’t pay the same thing as FAANG, and that you’re sure that they’ll make you a fair offer.

If they give you a compensation range at the beginning of the process, take it seriously. Large company ranges are often fungible (we’ve seen FAANGs go $100k+ above their ranges with proper negotiation). Startup ranges are much less so, when it comes to cash. You may be able to negotiate a lot more equity, but if you don’t want equity, think seriously about whether you even want to continue with the process.


I replied to your other comment, but I think your point at the end is my biggest learning from the process. Small(ish) companies don't have the luxury of significant swings in their budget. I think I assumed there would be some flexibility when there wasn't.

If I decide money is my biggest motivator for a new job the next time I'm looking, I'll probably sign up for your service to get coaching; I imagine getting guidance on navigating the FAANG process would be useful.


There is no value in the way technical interviews are being done - it's just a hazing ritual that tests what one remembers from practice or rote memorization. If I don't know how to center text or figure out how much rainwater can be trapped using dynamic programming, that doesn't mean I can't design or architect a complex real-world system.

Plus, the way these tech interviews are done is not inclusive of those who feel like a fish out of water due to panic attacks and/or anxiety.


I can see how you’d get to that conclusion, but I don’t agree with this take. “No value” is particularly strong wording. This is a fairly simple, cheap, and fast process for screening people at scale (think “hundreds per week”), though I agree that startups mostly shouldn’t rely on this type of interview (and most don’t, from what I can gather).

None of us authors are advocating exclusively for this interview type. Designing real-world systems is great, which is why most big tech companies have system design rounds in their processes (except for new grad interviews).

Finally, speaking as someone with diagnosed severe anxiety and a specific disorder that causes frequent panic attacks, I completely empathize with being a “fish out of water” in this process. If it helps, you should know that most big tech companies have accommodations for such things depending on your needs (extra time, allowance of service animals in the interview room, etc). I’m not sure any interview process will be anxiety-free (it isn’t for “normal” people either), but through time and effort I have passed Google, Meta, Amazon, and other big tech interviews that have this process. In my experience, these are hindrances to be addressed but not immovable blockers to passing an interview.


This is a good book but it’s more of a starter step. Not being able to solve the problems (which I think is the common case) should hint to you that deeper review of fundamentals is needed. Unfortunately there are not many resources that teach the fundamentals lucidly, so extensive searching is required.

Algorithms by Sedgewick is pretty good for learning the fundamentals, imo, and sufficient (together with the Cracking the Coding Interview book) to pass interviews at the FAANGs I've worked at (and was an interviewer at for years).

Oooh, I'm a BCtCI author, but I admit Sedgewick is the algo expert! His book is more about data structures and algorithms than about coding interviews (related but very different).

If you needed to completely learn data structures and algorithms from scratch, I would NOT recommend his book, but instead, his FREE COURSE since it is so much more visual (and free): https://www.coursera.org/learn/algorithms-part1

For what it is worth, taking a course on DS&A is very different than getting good at these interviews for most people. The two are closely related but very different.


I'm Aline, one of the authors. We actually doubled down on teaching the fundamentals in this book. If you've read it, please tell us where we could have done better.

If you have not read it, check out two of the technical chapters to see our approach to teaching binary search and sliding windows and let us know what you think: https://bctci.co/free-chapters


Sorry, the opinion was based on the prior version of the book. I took a look at the link. My honest opinion is that the binary search breakdown feels a bit rushed. On the other hand, I really liked the explanation of the various sliding window problems, since that is a topic that is normally not covered in CS fundamentals. I also like the pseudocode -- it reads well. I think a book focused on concepts could serve as a delightful companion to this book. Or simply add the pages. Thanks, I'm sure you will sell tons of copies.

Hey, I'm one of the main authors of the book. Feel free to ask any questions if you have them. :)

There are many great resources that have come out since the original CtCI, but we waited to release this until we had substantially new advice to give that is different from other options.

TL;DR: This book teaches you how to think, not memorize questions, and how to reason about your job search and handle recruiters.

Here are nine chapters from the book that you can read so you can make your own decisions. They include:

- Seven non-technical chapters that walk you through important topics such as why technical interviews are broken, what recruiters won't tell you, why you should not spend a lot of time on resumes, and how to get in the door at companies without a referral.

- Two technical chapters covering the two easiest-to-mess-up-in-an-interview topics: binary search and sliding windows. Our new take on binary search teaches one template that works for every binary search problem on Leetcode, with only a single-line change you need to remember. The sliding windows chapter features 6 unique sliding window templates that make off-by-one errors a thing of the past. (A rough sketch of what this style of template looks like follows the link below.)

https://bctci.co/free-chapters
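For a rough flavor of the "one template" idea, here is a generic sketch in Python: a first-true binary search over a monotonic predicate, plus a common variable-size sliding window shape. These are standard formulations for illustration, not necessarily the book's exact templates.

    # Generic binary search: first index in [lo, hi) where a monotonic
    # predicate flips from False to True (returns hi if it never does).
    def first_true(lo, hi, pred):
        while lo < hi:
            mid = (lo + hi) // 2
            if pred(mid):
                hi = mid        # answer is mid or somewhere to the left
            else:
                lo = mid + 1    # answer is strictly to the right
        return lo

    nums = [1, 3, 3, 5, 8]
    lower_bound = first_true(0, len(nums), lambda i: nums[i] >= 3)   # -> 1
    # Upper bound, last occurrence, and search-on-the-answer variants reuse the
    # same loop and only change the predicate -- roughly the "one line" idea.

    # A common variable-size sliding window shape: longest subarray with
    # sum <= k, assuming non-negative numbers.
    def longest_at_most_k(nums, k):
        left = total = best = 0
        for right, x in enumerate(nums):
            total += x
            while total > k:              # shrink until the window is valid
                total -= nums[left]
                left += 1
            best = max(best, right - left + 1)
        return best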


I used interviewing.io's resources for my most recent job search; however:

1. Even once I studied enough to know how to use the more challenging algorithm techniques (dynamic programming, graphs, the more esoteric versions of sliding window, etc.) that still wasn't enough to pass interviews. The time I was given to solve the problems was so short that it would take me at least a month of practice to get fast enough at just one of the techniques.

2. It's difficult to predict what interviewers will ask. In a six month interviewing period I might only get asked a given advanced topic once.

I eventually gave up on becoming sufficiently proficient in more than the basic algorithm techniques. I couldn't realistically prepare for each of the algorithm techniques on interviewing.io and I couldn't predict which I was likely to be asked. It was just luck.

So I found that the best strategy was just to apply to as many jobs as I could and hope that I didn't get asked one of the more advanced topics. In effect, if I'm reliant on luck either way, then I may as well rely on the approach that gives me more time to put in applications rather than studying.

I guess my question is: what is the intended use of this sort of book, considering the world we find ourselves in? That is to say: in an ideal world where interviewers care about candidates demonstrating competency, taking your approach of learning to understand the basic patterns that make up these questions makes a lot of sense (and would probably make people better programmers). However, in our current world where interviewers demand raw speed, that doesn't seem viable.


Shoot me an email at aline@interviewing.io if you'd like some mock interviews on us. I'm sorry we didn't get you there... but if it's a matter of more reps to get your speed up, happy to get you those for free.

I hope Mike jumps in to answer your question about where the book fits in, in a world where speed matters more than it used to.


I added my response in this thread! :)

Hey, thanks for the thoughtful question! You’re absolutely right to point out something that a lot of coding interview experts don’t like to admit: interviews aren’t always predictable. In fact, one of the first sections in the book has each of us sharing a story about an interview we totally bombed. We also spend several chapters breaking down how flawed technical interviews can be.

The reality is, there’s no “interview police” making sure interviewers are asking great questions or grading on the right things. Speed, for example, is often overrated—it usually comes at the expense of correctness. While the interviewing.io learning center covers general patterns, our book takes things further by organizing topics based on dependencies, likelihood of being asked, and difficulty. We believe that the order in which you learn these topics really matters. Of course, most people won’t aim to learn everything—the focus should be on what’s most relevant given the time before an interview.

To answer your specific question, the book assumes you’ll typically have around 20–35 minutes to solve a problem completely. Some companies approach things differently—Meta, for instance, often asks two questions per interview and places a bigger emphasis on speed. That said, Meta interviews tend to be more straightforward these days, with a common study strategy being to sort tagged Meta-questions on LeetCode and work through the top 100. This is literally the process they encourage.

The good news? We’ve got solid data showing that most big tech interviews (outside of Meta) don’t put nearly as much pressure on speed. Of course, statistically, there will be other interviewers expecting two questions per interview, but this isn't as common as the average Blind post would have you think.

We discuss speed in the book and how to get better at it, but it is similar to the old marksman's adage: "Slow is smooth, smooth is fast." Speed comes with time, and without knowing how much time you've already put in (or how you've been practicing), it is difficult to say much more beyond that. Feel free to ping me on the interviewing.io server if you want to discuss this further, and I might be able to help.

EDIT: One extra thought. The resources you used to prepare on interviewing.io are very different from what is in this book (and the book content, I'll say with humility, is much better -- and I wrote a lot of both). Don't take our word for it: you can check out the binary search topic in the interviewing.io Learning Center (which, as you can see, I wrote) and then check out our binary search chapter at the link below for free (which I also mainly wrote). You'll find the book materials are highly divergent (for the better).

https://bctci.co/free-chapters


Isn't it a bad time to invest in this? If an AI can help you learn the material, it can probably replace you pretty quickly.

This book mostly teaches things that I don't think AI can...

But also -- seemingly-silly things can help build foundation for more human learning.

Multiplication tables seemed to have an important role for centuries -- they don't turn you into Einstein, but I bet he knew what 2x5 equaled pretty early in life.

Here's just one Amazon review on this:

"This is the best overall resource that I've seen for software engineering job searches and coding interviews. There's a lot of free online resources out there so I was on the fence about buying this, but this is the most high-quality and practically useful collection of tips, tricks, advice, and explanation of relevant CS data structures & algorithms concepts that I've seen. There's also a lot of practical advice on how to approach the job search. This, plus leetcode, would be my most highly recommended resources for coding interviews. I think the content of this book would be relevant and useful from the new grad level through the staff SWE level. It's also a totally different book than the original CTCI (which I also have), not just another iteration of the same thing. I never write reviews for anything, but I felt like I had to for this because I was surprised by how good this is."


I think that a lot of the benefit might be having a defined plan for approaching a job search. E.g. what specific topics do you cover and what topics don't you cover? For a given topic, how in depth do you go and what aspects do you cover? Etc.

I could see using AI to research a given topic or walk you through a given problem. However, at this point I don't think AI is up to packaging up a well rounded textbook level study on a topic.


AI can absolutely help you. A lot of the benefits of the book come from the reasoning, though, which AI cannot do. For instance, you might try a question, and the optimal answer uses a heap, but how did we intuit that? What repeatable reasoning steps could we walk through to do that for a different problem? That's what 500 pages in this book is about. XD
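(As a toy illustration of the kind of reasoning described above -- "we repeatedly need the smallest of the current best k, so a min-heap fits" -- here is a standard k-largest-elements sketch in Python. The problem and names are generic examples, not taken from the book.)

    import heapq

    # Toy example of "the optimal answer uses a heap": keep the k largest
    # elements of a stream. Repeatable reasoning step: we keep needing the
    # smallest of the current top k, and a min-heap gives that in O(log k).
    def k_largest(stream, k):
        heap = []                          # min-heap of the k largest so far
        for x in stream:
            if len(heap) < k:
                heapq.heappush(heap, x)
            elif x > heap[0]:              # beats the weakest of the top k
                heapq.heapreplace(heap, x)
        return sorted(heap, reverse=True)

    print(k_largest([5, 1, 9, 3, 7, 2], 3))   # [9, 7, 5]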

You are incorrect. ChatGPT will absolutely tell you why the optimal answer would use a heap. I've found that ChatGPT is much, much better than any book resource because it amalgamates answers from everywhere into a coherent answer, and you can keep asking questions about it to gain deeper understanding, as opposed to a book where, if you don't understand something, that's it.

Hey friend, totally okay if we don’t see eye to eye on this. No big deal. But just to share my perspective—if you ask ChatGPT to break down a genuinely hard, new problem (not something that’s been around forever with tons of tutorials and blog posts), the explanations tend to stay pretty surface-level. For example, you might get something like, “We need to use DFS because we need to search the graph.” It doesn’t really get into the deeper reasoning behind why that’s the right approach or what led to this decision when others were possible.

There’s actually some interesting data on this here: How hard is it to cheat with ChatGPT in technical interviews?

Even AI experts point out that parroting tutorials isn’t real reasoning, and that’s still a tough spot for AI.

All that said, even if AI improves, I still think this book offers a lot of value. It’s packed with new templates and practical strategies to help you get unstuck during interviews—stuff based on data from over 100,000 mock interviews. No pressure if it’s not your thing, but if you’re curious, you can check out some of the technical (and non-technical) chapters at the link below. They cover approaches you definitely won’t see ChatGPT come up with. :)

https://bctci.co/free-chapters


Good to see a thriving industry based on people's discontent, rather than it not existing.

and the follow-up guide, How to send screenshots to ChatGPT so it doesn't look like you're copy and pasting the question text.

The only coding interview left is "here's the question, here's the code ChatGPT generates, what's wrong with it?"


Just go into the interview wearing a “hearing aid” while a fellow confidant waits outside in an unmarked white van, feeding you the answers.

This comment section, saturated with people marketing this book, is absolutely bizarre and distasteful.

This really feels like stepping over a line from sharing content to blatant advertising.

From the guidelines: > Please don't use HN primarily for promotion. It's ok to post your own stuff part of the time, but the primary use of the site should be for curiosity.


If it helps, we (the authors) didn't even post this to HN (or ask someone else to). We definitely will answer questions as they come up though. And we were giving away free chapters from the book before today and are just linking to it. I don't see anything we're doing that is breaking the guidelines. Lots of authors chime in when their posts/articles/books are brought up.

> Technical interviews are much harder today than they used to be. Engineers study for months and routinely get down-leveled despite that. Beyond Cracking the Coding Interview, in addition to covering a bunch of new questions and topics, teaches you how to think instead of memorizing. Grinding and memorization isn’t the way in this market (though in fairness, it’s never really the way). With us, you’ll still have to do the work, of course, but we’ll teach you to work smarter.

It's like they're doubling-down on the steaming load of BS that is the techbro interview.

Please permit me to share my 'review' of the original book, which I made when I gave away my copy on an MIT-internal free-stuff email list.

> Date: Friday, October 20, 2023 7:53:39 AM

> Subject: [Reuse] "cracking the coding interview" book of satan

> "Cracking the Coding Interview" 6th Edition

> I literally just opened this book to a random page, about 1/3 of the way through, and the first thing I read on the page was, "You should first ask your interviewer if the string is an ASCII or a Unicode string. Asking this question will show an eye for detail and a solid foundation in computer science." With no hint that the writer was aware of the irony.

> However, since some successful tech companies were founded during the dotcom boom by students with no professional experience, they adopted hiring filters based around a cocky student's idea of what is important in software engineering. Then, subsequent startups, without the benefit of the dotcom gold rush handing gobs of money to anyone who could spell "HTML", looked at those companies that had a lot of money, and thought "I, too, want a lot of money, so I should mimic whatever that big company with tens of thousands of worker drone employees does." Now we have a whole lot of startups who are filtering their hiring with a mix of fratbro hazing rituals, and attacking anyone who climbs the ladder in the middle of the room to reach the bananas.

> This book will tell you not to climb the ladder to reach the bananas. Simultaneously, it will rot your brain, crush your soul, and make you a terrible person when you're later on the other side of the interviewing table and you inflict it upon someone else.

> Unfortunately, I have old-school conditioning against destroying books, no matter how offensive and harmful.

> TO CLAIM: Please email me and say *which day* you promise you will no-contact pick up, in Mid-Cambridge.

> TO NOT CLAIM: Kudos, and maybe you join me in telling companies who recommend this book as interview prep material, "No, thank you".

(Epilogue: That copy of the book went to someone who responded, "As a linguist (and ex CS person) who is now considering selling her soul to big tech because academia is not a lot more better, I'd be interested in this cursed book!")


I'm sorry, this is just nonsensical -- I do not mean to say that all interviewing practices are good, or that there is not a lot of great humor in your post giving away the original book -- but as someone who has hired many software engineers and persevered through some coding questions as a candidate, these things have value. It's nothing about "worker drones" or "rotting your brain" -- companies need some way to see if people can 1. code and 2. collaborate before adding them to the team. Candidates need ways to understand what those hiring processes (not just the coding question) are like and to have a fair chance to prepare for them.

I'm going to quote again part of that passage to which I literally flipped open the book:

> Asking this question will show an eye for detail and a solid foundation in computer science.

That's a question an experienced programmer who developed string routines in a low-level language might well ask, and maybe that's who the authors stole it from as they were collecting material for the book, so that readers of the book could then BS and pretend to have thought to ask it, as this author recommends -- although the author should also know it has almost nothing to do with CS.

Perhaps one would protest that this bit isn't representative of the book. Feel free to argue that God was guiding my hand, to the only bit of poo in the book. But then you have to ask yourself why God thought that taking down this book was so important, for Him to intervene.

So I actually skimmed much of the book, and I'm confident in saying that it's full of poo.

Initially, this idea that people should have to study hundreds of pages of material specific to interview rituals was maybe a little too familiar to people of a certain socioeconomic class. Like an admissions-gaming "prep" service, which in the past you'd give to affluent kids to give them a special advantage in getting into a prestigious school, if you couldn't afford to buy the school a new building.

OK, we could be wrong about the snotty pedigree angle, so let's consult the author's bio on the book's Web site:

> Gayle is the Founder / CEO of CareerCup.com and the author of two books: Cracking the Coding Interview, Amazon.com's #1 best-selling interview book, and The Google Resume. She has worked for Google, Microsoft, and Apple and served on Google's hiring committee. She holds a BSE and MSE in Computer Science from the University of Pennsylvania and an MBA from the Wharton School.

Which makes it look like not only a full-of-poo Ivy fratbro gatekeeping ritual, but -- plot twist! -- one that was then cleverly sold out, by an MBA, to turn it into what's effectively a protection racket. (Youse have a lovely career potential; it would be a shame if somethings was to happen to it, because you didn't buy the book and memorize these disingenuous BS performance rituals.)

Back to your point: you don't need this full-of-poo book, nor the full-of-poo smug halfwit nonsense behind it, to assess "if people can 1. code and 2. collaborate". That's not what this book is about. You instead have to be smart, and stop forcing people to be full-of-poo corporate drone performers.

Hey, you know what: if you hire people the way this book teaches, even if you had a gazillion dollars, you'd have to throw away almost every new product they develop for over a decade, and then eventually some upstart will come along and overnight destroy the single gazillion-dollar product (gifted by ancient ancestors, from before the BS took over) that you've been coasting on all that time, because you're telling everyone from the very start, at the interview stage, that the company is about acting full-of-poo.


> Asking this question will show an eye for detail and a solid foundation in computer science.

My friend, you've clearly got the wrong book. I think there might be some confusion here. That exact sentence (as you've written it) doesn’t appear in our book—I just double-checked the digital version. None of the phrases “eye for detail,” “solid foundation in computer science,” or “asking this question” show up, either individually or together in our entire book.

If you're saying it is in the original Cracking the Coding Interview... ok? The book is a decade out of date at this point, and the originally linked post makes it clear that this book is very different from that one.


> My friend, you've clearly got the wrong book.

This thread is speaking of the original book.

> The book is a decade out of date at this point,

Companies continue to tell applicants to get the original book.

And the original book pretty much defined the current widespread techbro hiring idiocracy.

If the new book turns out to be a reversal, and begins with a passionate apology for earlier harm, and is made freely available (rather than double-dipping), and the original author goes on speaking tours (on their own dime) to recant to anyone who will listen, that would be a good start.


Ok friend. :)

> > You should first ask your interviewer if the string is an ASCII or a Unicode string.

UTF-7 they reply, with a twisted glint in their eye.


You got me! Hahaha :)

10/10 ! :)

I refuse to read this book and I encourage everyone here to do likewise. Don't support this nonsense.

This from a user whose previous HN comments included:

"this will just encourage the further replacement of American citizens with H-1B's from India who will do whatever upper management demands, because they have no choice.

I look around my workplace and see Indians building simple CRUD APIs because allegedly Americans can't? lol."

??!


If we're pulling comment histories, you have quite a few comments linking to interviewing.io which is affiliated with one of the book's authors (Aline Lerner). Coincidentally, you've also posted pretty glowingly about this book. Do you have any connections to the authors that might be compromising your objectivity?

If it helps, I (Mike) don't know who @stmw is. I can at least say they aren't one of the four authors of this book.

That said, I don't think the reasonable conclusion to come to when someone is supportive of a product is that they are invested in it. Sure, suspicion is a healthy thing, but for the record you can look through my comment history and see I recommend LeetCode and a coding book called EPI. No compromising connections—I just like the products and let people know about them when I see people talking about the topics. ¯\_(ツ)_/¯


What point are you trying to make?


