u/Nearby-Way8870

Why does my Python loop keep overwriting the variable instead of storing all the values?

I need some help understanding something about loops in Python that I can't get past.

I'm writing a loop that goes through a list of numbers, does a small calculation on each one, and I want to save every result. But after the loop finishes, my variable only holds the last value from the final iteration. Everything before it is gone.

Here is a simple version of what I'm doing:

result = 0
for num in [1, 2, 3, 4, 5]:
    result = num * 2
print(result)  # only prints 10, I want all results

I want to keep every value, not just the last one. I checked the FAQ and the index but couldn't find anything that directly addresses this specific loop behavior for a mid-level beginner. I understand basic loops but I'm clearly missing something about how Python handles variable reassignment inside a loop.

Should I be using a list and appending each result? I tried that briefly but wasn't sure if that was the right direction or if there's a cleaner way I should learn first.

Been learning Python for about 5 months on my own. Not a complete beginner but still solidifying the fundamentals.

u/Nearby-Way8870 — 2 days ago

How do you actually get good at object oriented programming in Java?

This is a question I wish someone had given me a real answer to when I first started because most of the advice out there either stays too theoretical or just tells you to practice without explaining what productive practice actually looks like for OOP specifically. So let me break this down honestly based on what has actually worked and open it up for people with more experience to add their perspective.

The first thing I had to realize is that object oriented programming is not just a set of syntax rules to memorize. It is a completely different way of thinking about how you organize and structure a program. A lot of beginners, myself included early on, treat OOP like a checklist. Learn what a class is, learn what inheritance is, learn what polymorphism is, check the boxes and move on. But knowing the definitions and being able to actually design a program using OOP principles are two completely separate things. The definitions are just vocabulary. The real skill is knowing when and why to apply each concept to solve a real problem cleanly.

The biggest breakthrough for me came when I stopped thinking about code and started thinking about real world things first. OOP is fundamentally about modeling the world in a way that a program can work with. So before writing any code I started asking myself: what are the actual things involved in this problem, what do they do, and how do they relate to each other? If I am building a library system the things are books, members, and loans. A book has a title, an author, and an availability status. A member has a name and a list of borrowed books. A loan connects a member to a book and has a due date. That kind of thinking before touching code is where good OOP design starts, and most beginners skip it entirely because they are in a rush to start typing.

Once you have that mental model then you start translating it into classes and the design feels natural instead of forced. The class is just a blueprint for one of those real world things. The fields are its properties. The methods are what it can do or what can be done to it. When you approach it that way the structure starts making intuitive sense rather than feeling like arbitrary rules someone invented to make your life harder.
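
To make that concrete, here is a rough sketch of the library example in Java. Every name here is invented for illustration, not a finished design:

import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;

// A book has a title, an author, and an availability status.
class Book {
    private final String title;
    private final String author;
    private boolean available = true;

    Book(String title, String author) {
        this.title = title;
        this.author = author;
    }

    boolean isAvailable() { return available; }
    void markBorrowed() { available = false; }
    void markReturned() { available = true; }
}

// A member has a name and a list of borrowed books.
class Member {
    private final String name;
    private final List<Book> borrowed = new ArrayList<>();

    Member(String name) { this.name = name; }

    void borrow(Book book) {
        book.markBorrowed();
        borrowed.add(book);
    }
}

// A loan connects a member to a book and has a due date.
class Loan {
    private final Member member;
    private final Book book;
    private final LocalDate dueDate;

    Loan(Member member, Book book, LocalDate dueDate) {
        this.member = member;
        this.book = book;
        this.dueDate = dueDate;
    }
}

public class LibraryDemo {
    public static void main(String[] args) {
        Member member = new Member("Ada");
        Book book = new Book("Clean Code", "Robert C. Martin");
        member.borrow(book);
        System.out.println(book.isAvailable()); // false
    }
}

Notice how the classes read almost like a transcription of the plain English description above. That is what it feels like when the mental model comes before the code.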

Encapsulation is usually the first OOP concept people learn and also the first one people treat as just a formality. Making fields private and writing getters and setters feels like busywork when you are starting out. But the real point of encapsulation is controlling how the internal state of an object gets accessed and changed. If any part of your program can reach in and change an object's data directly you lose control over what state that object can be in and bugs become very hard to track down. Understanding encapsulation as protection of state rather than just a syntax convention changes how seriously you take it.
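
Here is that idea in miniature, using a deliberately tiny made-up account class:

// With balance private, the ONLY ways to change it are the methods
// below, so the object can never be put into an impossible state.
public class BankAccount {
    private double balance;

    public void deposit(double amount) {
        if (amount <= 0) {
            throw new IllegalArgumentException("Deposit must be positive");
        }
        balance += amount;
    }

    public void withdraw(double amount) {
        if (amount <= 0 || amount > balance) {
            throw new IllegalArgumentException("Invalid withdrawal");
        }
        balance -= amount;
    }

    public double getBalance() {
        return balance;
    }

    public static void main(String[] args) {
        BankAccount account = new BankAccount();
        account.deposit(100);
        account.withdraw(30);
        System.out.println(account.getBalance()); // 70.0
    }
}

If balance were public, any line anywhere in the program could set it to -5000 and you would never know where it happened.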

Inheritance is where a lot of beginners get excited and then immediately overuse it. Inheritance models an "is a" relationship. A dog is an animal. A savings account is a bank account. That relationship should be genuinely true before you reach for inheritance. The mistake people make is using inheritance just to share code between classes that are not truly related in that way. When you do that you create tight coupling between classes that makes your code rigid and hard to change later. A good rule of thumb early on: if you are not sure whether inheritance is right, ask yourself whether the child class truly is a more specific version of the parent class or whether you are just trying to reuse some methods. If it is the latter, look at composition instead.
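
A tiny sketch of that distinction, with throwaway example classes:

// Genuine "is a": a savings account really is a bank account.
class Account {
    protected double balance;
    void deposit(double amount) { balance += amount; }
}

class SavingsAccount extends Account {
    private final double rate;
    SavingsAccount(double rate) { this.rate = rate; }
    void applyInterest() { balance += balance * rate; }
}

// Code reuse without a true "is a": a car is not an engine, it HAS one.
class Engine {
    void start() { System.out.println("engine started"); }
}

class Car {
    private final Engine engine = new Engine(); // composition: has-a
    void drive() { engine.start(); }
}

public class RelationshipDemo {
    public static void main(String[] args) {
        SavingsAccount savings = new SavingsAccount(0.05);
        savings.deposit(100);
        savings.applyInterest();
        new Car().drive();
    }
}

Car gets to use Engine's behavior without being chained to its class hierarchy, which is exactly the flexibility the extends version gives up.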

Interfaces are something I underestimated for way too long. An interface defines a contract. It says "any class that implements me must be able to do these specific things." The power of interfaces comes from the fact that different classes can fulfill the same contract in completely different ways, and the code that uses them does not need to know or care which specific class it is working with. That flexibility is what makes large programs manageable and testable. Getting comfortable with interfaces early is one of the things that separates developers who can build real systems from developers who can only build small scripts.
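
A minimal sketch of the contract idea, with invented payment classes:

// The contract: anything implementing this must be able to pay.
interface PaymentMethod {
    void pay(double amount);
}

class CreditCard implements PaymentMethod {
    public void pay(double amount) {
        System.out.println("Charging card: " + amount);
    }
}

class PayPal implements PaymentMethod {
    public void pay(double amount) {
        System.out.println("Sending via PayPal: " + amount);
    }
}

public class Checkout {
    // This method neither knows nor cares which concrete class it gets.
    static void process(PaymentMethod method, double amount) {
        method.pay(amount);
    }

    public static void main(String[] args) {
        process(new CreditCard(), 20.0);
        process(new PayPal(), 20.0);
    }
}

Two completely different implementations, one contract, and the checkout code never changes when a new payment type is added.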

Polymorphism builds directly on interfaces and inheritance and it is the concept that took me the longest to appreciate. The idea that you can write code that works with a general type and have it automatically do the right thing depending on the specific object it is working with at runtime is genuinely powerful. But you only start to appreciate it when you are building something complex enough that without it you would be writing massive chains of if else statements to handle every specific type separately. Build something like that once and polymorphism will never feel abstract again.
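
Here is the smallest version of that I can sketch. Without the shared interface, the loop below would have to be an instanceof chain with one branch per shape:

interface Shape {
    double area();
}

class Circle implements Shape {
    private final double radius;
    Circle(double radius) { this.radius = radius; }
    public double area() { return Math.PI * radius * radius; }
}

class Square implements Shape {
    private final double side;
    Square(double side) { this.side = side; }
    public double area() { return side * side; }
}

public class AreaDemo {
    public static void main(String[] args) {
        Shape[] shapes = { new Circle(2), new Square(3) };
        for (Shape s : shapes) {
            // The right area() is chosen at runtime for each object.
            System.out.println(s.area());
        }
    }
}

Add a Triangle tomorrow and this loop does not change at all. That is the payoff.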

The most practical advice I can give for actually getting good at OOP in Java is to build projects that are complex enough to require multiple classes that interact with each other. A single class program teaches you almost nothing about OOP design. You need to feel the friction of poorly designed classes to understand why good design matters. Build something like a simple banking system with accounts, customers, and transactions. Build a basic inventory management system. Build a simple school management system with students, courses, and grades. These kinds of projects force you to make real design decisions and live with the consequences of those decisions as the project grows.

When your code gets messy and hard to add features to ask yourself why. Nine times out of ten the answer comes back to an OOP design decision you made earlier. Maybe you put too much responsibility in one class. Maybe you used inheritance where composition would have been cleaner. Maybe you did not define clear boundaries between your classes. Those moments of friction are the most valuable learning experiences OOP has to offer if you pay attention to them instead of just hacking around them.

Read your own code after you write it and ask yourself if someone who had never seen this project could understand what each class is responsible for just from its name and its public methods. If the answer is no then the design needs work. Clarity of responsibility is one of the clearest signs of good OOP design and it is something you can evaluate in your own code even as a beginner.

Be patient with this one. OOP is one of those things that genuinely clicks in layers. You will think you understand it and then build something more complex and realize there was another layer you had not reached yet. That is completely normal and it happens to everyone. Keep building, keep reflecting on your designs, and keep asking why things are structured the way they are in code you read and study. The understanding deepens over time if you stay curious and deliberate about it.

u/Nearby-Way8870 — 5 days ago

I keep forgetting syntax — does this happen to everyone or is it just me?

Genuinely asking this because it is starting to mess with my confidence and I want to know if this is a normal part of the process or a sign that something is wrong with how I am studying.

I have been learning Python for about five months now. I feel like I understand the logic behind most of what I have covered. I get why loops work the way they do, I understand how functions are structured, I follow the reasoning behind object oriented programming even though it took me a while to click. The concepts are not the problem. The problem is that I will walk away from a session feeling solid and then come back two days later and forget whether it is a colon or a parenthesis, blank on the exact syntax for a list comprehension, or second guess myself on something as simple as how to open and read a file properly.

It is not like I forget everything. It is more like the details get fuzzy and I end up checking my own notes or going back to documentation constantly even for things I have written out ten times before. That part bothers me because it makes me feel like nothing is actually sticking long term.

What I have tried is writing things out by hand, keeping a personal notes document, and building small projects to practice what I learn. It helps while I am in the middle of it but the retention between sessions is still inconsistent.

My question is whether this is something that just fades naturally the more you code or whether there is a specific way people actually train themselves to remember syntax without having to look it up every single time. And for working developers out there, how often are you still referencing documentation on the job, even for languages you have been using for years?

Honest answers only, I can take it.

u/Nearby-Way8870 — 5 days ago

Why does my loop keep printing the same value instead of updating?

First sem CS student here, taking Intro to Programming and we just got into loops and functions. I feel like I understand the concept when the professor explains it in class, but the second I sit down to write it myself everything falls apart.

So here is what is happening. I am writing a simple program that is supposed to go through a list of numbers and print each one. But no matter what I do, it keeps printing the first value over and over instead of moving through the list. I have been staring at this for like two hours and I genuinely cannot figure out what I am missing.

This is basically what I have:

numbers = [10, 20, 30, 40]
i = 0
while i < len(numbers):
    print(numbers[0])

I know it is probably something small and obvious but I cannot see it. I checked the course slides and they do not really explain what happens when the loop does not move forward. Is there something specific I need to add to make the loop actually advance to the next item? Any explanation would really help me understand what is going on under the hood, not just a fix.

u/Nearby-Way8870 — 5 days ago

This is something I went through myself and I want to talk about it openly because I think it is one of the most common and most discouraging experiences a beginner can have, and nobody really explains what is actually happening when you feel this way. You watch a tutorial, you read the chapter, you follow along and everything makes complete sense in the moment. Then you close the material, open a blank file, and your mind goes completely empty. It feels like you learned nothing and that feeling is brutal. But here is the thing: it does not mean you are bad at this. It means you have been learning passively, and passive learning and actual coding ability are two completely different things.

What you are experiencing has a name in learning psychology and it is called the illusion of competence. When you follow along with someone else solving a problem your brain gets the signal that you understand it because you are keeping up and nothing is confusing you. But keeping up with someone else's solution is fundamentally different from generating a solution yourself. Recognition is not the same as recall. Watching someone else cook a meal and being able to cook that meal yourself are not the same skill even if you followed every step closely. The same principle applies directly to coding.

The reason this gap exists in programming specifically is because writing code from scratch requires you to hold multiple things in your head at the same time. You need to remember the syntax, think through the logic, structure the program correctly, and anticipate problems before they happen all at once. When you are following a tutorial someone else is handling all of that cognitive load and you are just along for the ride. The moment you sit down alone all of that load lands on you at once and it feels overwhelming because you have never actually carried it before.

So the fix is not to watch more tutorials or read more documentation. The fix is to deliberately put yourself in situations where you have to generate code without a safety net and do that repeatedly until it becomes uncomfortable in a productive way rather than a paralyzing way.

Here is a concrete approach that actually works. After you finish any tutorial or lesson close it completely. Do not have it open in another tab. Then try to rebuild what you just learned from scratch using only your memory. You will get stuck. That is the point. The places where you get stuck are exactly the gaps in your actual understanding rather than your recognition. Write down what you could not remember, go back and look at just that specific thing, then close it again and keep going. This method feels slower but it builds real retention instead of the illusion of it.

Another thing that helps is taking any concept you just learned and applying it to a completely different problem than the one in the tutorial. If the tutorial used a loop to print numbers one to ten then you write a loop to do something totally unrelated. A simple inventory counter, a basic temperature converter, anything. Transferring a concept to a new context is what proves you actually own that concept rather than just recognizing it in its original form.

Start writing small programs with no tutorial guidance at all as early as possible even if they are ugly and inefficient. A number guessing game. A basic calculator. A program that takes a list of names and sorts them. The quality does not matter at this stage. What matters is that every line came from your own thinking. Every bug you hit and fix in your own code teaches you more than ten tutorials watched passively.

Also stop measuring your progress by how much material you have covered and start measuring it by what you can build without help. Those are very different metrics and for a long time they feel out of sync. You can cover a lot of material and still not be able to build much on your own. The second metric is the one that actually matters for becoming a developer.

The gap you are feeling right now is not a sign that you cannot do this. It is a sign that you have been in consumption mode when you need to shift into production mode. That shift is uncomfortable at first because it means sitting with confusion and pushing through it instead of having someone guide you through it. But that discomfort is exactly where real skill gets built. Lean into it instead of avoiding it and you will start seeing real progress faster than you expect.

u/Nearby-Way8870 — 10 days ago

how do you actually get good at problem solving in coding?

This is something I have been thinking about seriously for the past few months and I want to have an honest conversation about it because I think a lot of beginner content out there gives very surface level advice on this topic. Things like "just practice more" or "do leetcode every day" without actually explaining the process behind getting better at breaking down and solving problems. So let me share what has actually helped me and open it up for people with more experience to add to it.

First thing I had to accept was that problem solving in coding is a skill completely separate from knowing syntax. You can memorize every method in a language and still freeze up when you sit down with a problem you have never seen before. Syntax is just the tool. Problem solving is knowing how to use that tool when the situation is unfamiliar. Those two things develop on different tracks and you have to train both intentionally.

The biggest shift for me came when I stopped trying to jump straight to writing code the moment I read a problem. That instinct to just start typing is actually one of the worst habits a beginner can have. Before I write a single line now I make myself slow down and go through a process. I read the problem twice. I write out in plain English what the input is, what the output should be, and what the relationship between them is. Then I think about edge cases. What happens if the input is empty? What happens if there are duplicates? What happens if the numbers are negative? Thinking through those scenarios before touching code saves enormous amounts of debugging time later.

After I understand the problem I try to solve it manually first without any code at all. I pick a simple example and I walk through it step by step in my head or on paper like I am the computer executing the instructions. This sounds slow but it forces you to think in the logical sequence that code actually follows. A lot of bugs come from people writing code based on a fuzzy mental model of what should happen. Working through examples manually sharpens that model before you start writing.

Then I write out my approach in plain language or pseudocode before touching actual syntax. Something like "first loop through the array, check each element against the condition, if it matches store it in a new list, return the new list at the end." That kind of rough outline. Getting the logic right in plain language first means when I start writing real code I am just translating, not figuring out the logic and the syntax at the same time. Trying to do both simultaneously is what causes beginners to get stuck and frustrated.
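
To show what I mean by translating rather than inventing, here is that exact outline turned into code. I happen to be using Java here, and "the condition" is arbitrarily chosen as keeping even numbers:

import java.util.ArrayList;
import java.util.List;

public class FilterExample {
    // The outline, line by line: loop, check, store matches, return.
    static List<Integer> keepEvens(int[] numbers) {
        List<Integer> matches = new ArrayList<>();
        for (int n : numbers) {
            if (n % 2 == 0) {   // the condition from the outline
                matches.add(n);
            }
        }
        return matches;
    }

    public static void main(String[] args) {
        System.out.println(keepEvens(new int[] {1, 2, 3, 4, 5, 6})); // [2, 4, 6]
    }
}

Every decision in that code was already made in the pseudocode. The typing part became trivial.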

Another thing that helped me significantly was changing how I handle being stuck. My old approach was to stare at the problem for twenty minutes getting increasingly frustrated and then look up the answer. That teaches you almost nothing. My current approach is to give myself a real focused effort, maybe thirty to forty-five minutes of genuine attempts where I try different angles, then if I am still stuck I look at just enough of a hint to get unstuck, not the full solution. Then I close the hint and finish it myself. Then after I solve it I look at other solutions and ask myself why they made the choices they made. That review step is where a huge amount of learning happens and most people skip it entirely.

Consistency matters more than volume here. Doing two or three problems a day with full focus and real reflection is worth more than grinding ten problems while half paying attention. You want to be present and deliberate every time you sit down, not just racking up a number.

The other piece that does not get talked about enough is building your pattern recognition over time. A lot of coding problems are variations of a smaller set of core patterns. Two pointer techniques, sliding windows, nested loops for comparisons, recursion for problems that break into smaller versions of themselves. As you solve more problems you start recognizing which pattern fits which type of problem and that recognition is what experienced developers have that beginners do not yet. It takes time to build but it does build if you are paying attention and reflecting on what you solve.
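
To make one of those patterns concrete, here is the classic two pointer setup on a sorted array. The specific task, checking for a pair that sums to a target, is just one well-known example of where the pattern fits:

public class TwoPointerDemo {
    // One pointer starts at each end of a sorted array and they
    // walk toward each other, ruling out candidates as they go.
    static boolean hasPairWithSum(int[] sorted, int target) {
        int left = 0;
        int right = sorted.length - 1;
        while (left < right) {
            int sum = sorted[left] + sorted[right];
            if (sum == target) return true;
            if (sum < target) left++;  // need a bigger sum
            else right--;              // need a smaller sum
        }
        return false;
    }

    public static void main(String[] args) {
        int[] nums = {1, 3, 5, 8, 11};
        System.out.println(hasPairWithSum(nums, 13)); // true (5 + 8)
        System.out.println(hasPairWithSum(nums, 2));  // false
    }
}

Once you have seen this shape a few times, you start spotting it in problems that never mention pointers at all.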

Be patient with yourself on this. Problem solving ability feels like it grows slowly and then suddenly. There will be stretches where you feel like nothing is clicking and then one day you will sit down with a problem that would have destroyed you two months ago and you will work through it cleanly. That moment is real and it comes if you stay consistent and stay intentional about how you practice.

u/Nearby-Way8870 — 10 days ago

This is a question I see come up a lot and I want to give a thorough answer because I think a lot of beginners get steered in the wrong direction on this one and end up frustrated wondering why nothing makes sense. The short answer is yes, you absolutely need to learn core Java before touching Spring Boot. But let me break down why that actually matters instead of just leaving it at a yes.

Spring Boot is a framework. What that means is it is built on top of Java and it exists to make certain things faster and easier when building backend applications. Things like setting up a web server, connecting to a database, handling HTTP requests and responses, managing dependencies. Spring Boot handles a lot of that complexity for you automatically.

And that sounds great on the surface, but here is the problem. When something breaks, and something will always break, you need to understand what is happening underneath the framework to fix it. If you do not have a solid grasp of core Java you will be staring at error messages that mean nothing to you and copying solutions from the internet without understanding why they work.

Spring Boot uses a concept called dependency injection heavily. That is baked into everything you do with it. If you do not understand classes, objects, interfaces, and how Java manages object creation and relationships, then dependency injection will feel like complete magic, and not the good kind. It will feel like things are happening for no reason and you will not be able to reason about your own code.
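
If dependency injection is an unfamiliar term, the whole idea fits in a few lines of plain Java with no framework involved. These classes are invented just to show the shape of it:

interface MessageSender {
    void send(String text);
}

class EmailSender implements MessageSender {
    public void send(String text) {
        System.out.println("Emailing: " + text);
    }
}

class WelcomeService {
    private final MessageSender sender;

    // The service never calls "new EmailSender()" itself. Whoever
    // constructs it decides which implementation gets injected.
    WelcomeService(MessageSender sender) {
        this.sender = sender;
    }

    void greet(String name) {
        sender.send("Welcome, " + name);
    }
}

public class DiDemo {
    public static void main(String[] args) {
        // Spring does this wiring automatically. Here we do it by hand.
        WelcomeService service = new WelcomeService(new EmailSender());
        service.greet("Sam");
    }
}

A huge part of what Spring Boot does is that last bit of constructor wiring, performed automatically across your whole application, which is why the underlying Java has to make sense first.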

Annotations are another big part of Spring Boot. Things like @RestController, @Service, @Autowired, and @Entity. These annotations are doing real work behind the scenes, and understanding what they are actually doing requires you to have a working mental model of how Java applications are structured. Without core Java knowledge you are just copying patterns without understanding them, and that ceiling hits you hard the moment you try to build anything beyond a basic tutorial project.
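
For a sense of what that looks like, this is roughly the smallest controller you will see in Spring Boot tutorials. Notice how little of what actually happens is visible in the code:

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// @RestController tells Spring to create this object, manage it,
// and route matching HTTP requests to its methods. None of that
// machinery appears anywhere in this file.
@RestController
public class HelloController {

    @GetMapping("/hello") // a GET request to /hello calls this method
    public String hello() {
        return "Hello from Spring Boot";
    }
}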

So what specifically should you know in core Java before starting Spring Boot? You need to be comfortable with object oriented programming. Classes, objects, constructors, inheritance, interfaces, and encapsulation. You need to understand exception handling because Spring Boot applications throw exceptions constantly and you need to know how to catch and handle them properly.

You need to know collections, meaning ArrayList, HashMap, and how to work with lists of data. You need to understand basic input and output and how data flows through a Java program. You should also have some exposure to generics because you will see them constantly in Spring Boot code.
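
As a rough self-check, a snippet like this should feel routine to write before you touch the framework. The data is obviously made up:

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class CollectionsWarmup {
    public static void main(String[] args) {
        // A list of names and a map from name to age, then a loop
        // that ties them together. Nothing clever, just fluency.
        List<String> names = new ArrayList<>();
        names.add("Ada");
        names.add("Linus");

        Map<String, Integer> ages = new HashMap<>();
        ages.put("Ada", 36);
        ages.put("Linus", 54);

        for (String name : names) {
            System.out.println(name + " is " + ages.get(name));
        }
    }
}

Note the angle brackets on List<String> and Map<String, Integer>. That is generics in action, which is exactly why some exposure to them matters before Spring Boot.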

You do not need to be a Java expert before starting Spring Boot. Nobody is saying you need five years of Java experience first. But you should be at a point where you can write a Java program from scratch, work with classes and objects comfortably, handle errors, and understand what the code you are writing is actually doing line by line.

A realistic checkpoint would be this. If you can build a small Java project on your own without following a tutorial step by step, something like a simple inventory system or a basic contact manager using classes and collections, then you are probably ready to start exploring Spring Boot. If you cannot do that yet then stay in core Java a little longer. The time you invest there will pay off massively once you get into the framework because everything will make so much more sense from day one.

Rushing into Spring Boot without that foundation is one of the most common mistakes beginners make and it leads to people giving up because they feel like they are not smart enough when the real issue is just that they skipped steps. Do not skip the steps.

u/Nearby-Way8870 — 27 days ago

I have been sitting on this question for a while now and I think it is time to just put it out there and hear from people who have actually lived through this decision one way or the other.

Here is my situation. I am 27, born and raised in New York, and I have been seriously considering a full career switch into software development for about the past eight months. I currently work a completely unrelated job, I do not have a CS degree, and going back to a four year university full time is not realistic for me right now financially or logistically. A part time or online program might be possible but I have not ruled anything out yet.

What I have done so far is spend the last four months learning Python on my own. I am past the basics, I understand functions, loops, object oriented programming, and I have built two or three small projects that actually do something even if they are not impressive by any real standard. I feel like I am making progress but every time I start feeling good about it I run into a job listing that says bachelor's degree in computer science or equivalent required and my confidence takes a hit.

What I genuinely want to understand is how much that degree requirement actually matters in practice. Are companies putting that in listings as a hard requirement, or is it more of a wish-list item they are willing to move past for someone with a strong portfolio and demonstrable skills? I know the answer probably varies by company size and type but I want to hear real experiences, not just general takes.

I also want to know what self taught developers who got hired actually did to make themselves competitive without the degree. Was it certifications, open source contributions, personal projects, networking, or some combination of all of it? And for people who did get the degree, do you feel like it gave you a real technical edge or was it mainly just a door opener that got your resume past the first filter?

I am not looking for someone to tell me what I want to hear. If the degree matters that much in today's market I want to know that straight up so I can figure out whether a part time program makes more sense than grinding self taught for two years and hitting a wall at the application stage. And if self teaching is genuinely viable I want to know what that path actually looked like for the people who pulled it off.

Real experiences only. I can handle honest answers either way.

u/Nearby-Way8870 — 27 days ago

I want to settle this question for myself once and for all because the internet has given me about fifteen different answers and I am more confused now than when I started.

I am 26, based in New York, and I am making a serious commitment to get into software development. Not casually poking around anymore, I mean actually putting in the hours every day with a real goal of being employable within the next year and a half to two years.

Before I commit to a language and go deep on it I want to make sure I am not picking something that looks good on paper but does not actually move the needle when it comes to getting hired in the US market specifically.

Here is where my head is at right now. Python seems like the most beginner friendly and it shows up everywhere from web development to data science to automation to AI related work. JavaScript feels unavoidable if I want to build anything visual on the web and pretty much every job listing I look at mentions it in some form. Java and C++ keep coming up in computer science conversations but they feel more academic than practical for someone trying to get hired without a degree.

I am not locked into any specific area yet. I am open to web development, backend work, data, or really anything with solid job demand in the US market right now. I do not have a CS degree and I am fully self taught so whatever I pick needs to have strong learning resources available and an actual hiring pipeline attached to it.

For people already working in the industry or who got hired recently, what would you genuinely tell someone in my position to start with, and why? Not what sounds good theoretically, what actually gets people jobs right now.

u/Nearby-Way8870 — 28 days ago

Alright so I've been getting this question a lot from people in my circle who are just starting out and I figured I'd put together something useful for the community here based on what actually works. If you are a complete beginner and Java is your first or second language this is a realistic breakdown of where to start and how to build a solid foundation without overwhelming yourself in the first two weeks.

First thing to understand is that Java is a statically typed object oriented language. That sounds intimidating, but what it means practically is that Java is strict about how you write code, and that strictness actually teaches you good habits early. You will need to set up your environment before you write a single line. Download the JDK, which stands for Java Development Kit, and install an IDE. IntelliJ IDEA Community Edition is free and it is genuinely the best option for beginners because it catches errors as you type and helps you understand what went wrong before you even run your code. Eclipse is another option but IntelliJ is smoother for most people starting out.

Once your environment is set up your first goal is to get comfortable with the absolute basics. Start with understanding how a Java program is structured. Every Java program starts with a class and a main method and that is your entry point. Get that pattern locked into your head early because you will see it constantly.
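
For reference, this is the skeleton I mean:

// The structure every Java program shares. The JVM looks for
// exactly this main method as the program's entry point.
public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello, world");
    }
}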

From there work through these topics in order and do not skip ahead. Data types and variables come first. Then operators and expressions. Then conditionals, meaning if-else and switch statements. Then loops, meaning for, while, and do-while. Then methods and how to pass arguments and return values. Then arrays and basic data structures. Only after you are comfortable with all of that should you start moving into object oriented programming concepts like classes, objects, inheritance, encapsulation, and polymorphism.
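
By the end of that stretch, a throwaway program like this should hold no surprises. It is trivial on purpose, just to put most of those pieces in one place:

public class Basics {
    // A method that takes an argument and returns a value.
    static int doubleIt(int n) {
        return n * 2;
    }

    public static void main(String[] args) {
        int[] numbers = {1, 2, 3, 4, 5}; // an array

        for (int n : numbers) {          // a loop
            if (n % 2 == 0) {            // a conditional
                System.out.println(doubleIt(n));
            }
        }
    }
}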

A mistake a lot of beginners make is jumping into OOP before they have the fundamentals locked in. That is how you end up confused and discouraged. Build the foundation first and OOP will make a lot more sense when you get there.

After you have a handle on OOP basics start writing small projects on your own. A simple calculator, a number guessing game, a basic student grade tracker. Nothing fancy. The point is to apply what you learned without following a tutorial step by step. That gap between watching someone else code and writing it yourself is where the real learning happens and you need to close that gap as early as possible.

From there you can start exploring more Java specific concepts like ArrayList, HashMap, exception handling, and basic file input and output. These come up constantly in real projects and in job interviews so do not treat them as optional extras.
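
To give a feel for two of those at once, here is basic file reading plus exception handling in one tiny program. The filename is just a placeholder:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class FileReadDemo {
    public static void main(String[] args) {
        try {
            // Read every line of a text file into a list.
            List<String> lines = Files.readAllLines(Path.of("data.txt"));
            for (String line : lines) {
                System.out.println(line);
            }
        } catch (IOException e) {
            // The file may not exist or may not be readable. Handling
            // that case explicitly is what exception handling is for.
            System.out.println("Could not read file: " + e.getMessage());
        }
    }
}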

The most important thing I can tell any beginner is to write code every single day even if it is just for thirty minutes. Reading and watching tutorials has its place but nothing replaces actually sitting down and writing code, breaking it, fixing it, and understanding why it broke. That cycle is how you actually get good.

Be patient with yourself. Java has more ceremony than Python and the learning curve feels steeper at the start but it levels out. Stick with it consistently and the language starts feeling natural faster than you'd expect.

u/Nearby-Way8870 — 28 days ago

This is a question I've been sitting with for a while and I figure this community would give me a more realistic answer than anything I'd find on a generic blog post. I started learning Java about three weeks ago after spending a few months with Python, and I'm trying to set honest expectations for myself instead of chasing some unrealistic timeline.

Right now I understand the basics. Variables, data types, conditionals, loops, and I've started getting into methods and classes. Object oriented programming is starting to click but I won't pretend I have it fully figured out yet. Things like inheritance, interfaces, and abstraction still feel a little shaky when I try to apply them to actual problems instead of just reading about them.

What I want to know is what "good at Java" actually looks like in stages. Like what's a realistic checkpoint at three months, six months, one year if you're putting in consistent daily effort? I'm not trying to become a senior engineer overnight. I just want to know when I can realistically start building real projects, when I can start applying for entry level roles, and when the language stops feeling like I'm fighting it every time I sit down to code.

I'm putting in roughly one to two hours on weekdays and more on weekends. I'm working through structured material and also trying small coding challenges on the side. No bootcamp, fully self taught.

If you've gone through this process with Java specifically I'd love to hear how your timeline actually played out. Not the polished version, the real one.

u/Nearby-Way8870 — 30 days ago

This has been frustrating me for months and I finally want to talk about it because I know I can't be the only one dealing with this.

I will watch a tutorial or follow along with a lesson and everything makes complete sense while I am doing it. I understand what each line does, I follow the logic, I feel like I actually get it. Then I close the tutorial, open a blank file, and try to build something on my own and my mind just goes completely empty. Like I am staring at a white screen with no idea where to even begin. It does not feel like I forgot anything, it feels like I never actually learned it in the first place.

I have been learning Python for about four months now. I have gone through conditionals, loops, functions, lists, dictionaries, and I just finished a section on object oriented programming. On paper I should be able to sit down and build something small. But every time I try I either freeze up immediately or I get two lines in, hit a wall, and end up back on Google searching for answers to things I feel like I should already know.

What I have tried so far is rewatching sections when I feel lost and taking notes while I code along. It helps in the moment but does not seem to stick when I go solo.

Is this a normal phase that most people go through, or am I actually missing something fundamental in the way I am studying? And if you got past this wall, what actually changed for you? What made things finally click where you could sit down and just build without needing someone to hold your hand through every single step?

u/Nearby-Way8870 — 30 days ago

I want to have an honest conversation about this because every answer I find online feels either way too optimistic or weirdly vague.

Bootcamps say you can be job ready in 12 weeks. YouTube videos talk about landing a dev job in 6 months. Then I go on here or other forums and read people saying it took them 2 years of consistent work before they even got their first interview callback. That is a massive range and I genuinely do not know what to believe.

I am currently about 3 months into learning Python. I do roughly an hour to two hours a day on weekdays and a little more on weekends when life allows it. I understand basic syntax, functions, loops, and conditionals, and I just started getting into object oriented programming, which is honestly where things started feeling harder. I have not touched data structures and algorithms yet and I know that is a whole other mountain to climb before even thinking about interviews.

What I want to know is this. For people who went through this process and actually got hired, how long did it realistically take from writing your first line of code to getting a real job offer? Not an internship, not freelance, an actual salaried position. And what did your schedule actually look like during that time?

I am not trying to rush it. I just want a real target to work toward so I can plan properly instead of grinding with no finish line in sight. The honest stories, good or bad, are what I am looking for here.

u/Nearby-Way8870 — 1 month ago

So I've been trying to figure out where to put my energy and this question keeps coming up in my head. I've played around with Python a little bit and honestly it felt pretty approachable. The syntax is clean, it reads almost like plain English, and I was able to get small things working without feeling completely lost. But now I'm considering adding Java to my learning path and people keep warning me that the jump is significant.

I want to understand what the actual difference feels like in practice. Not just the surface level stuff like "Python has less boilerplate" because I've read that a hundred times already. I mean the real day to day experience of learning Java coming from a Python background. Does the strictness of the type system genuinely slow you down at the beginning or does it end up teaching you better habits? Does object oriented programming in Java feel overwhelming when you're still getting your fundamentals down?

For context, I can write basic Python scripts, I understand loops and functions, and I have started touching classes. I'm not a complete beginner but I'm definitely not advanced either. My goal is to eventually get into backend development and I keep seeing Java come up alongside Python for that path depending on the industry.

I'm not scared of a challenge. If Java takes longer to get comfortable with but makes me a stronger programmer overall then that tradeoff makes sense to me. I just want honest input from people who have learned both so I know what I'm actually walking into before I commit time to it.

u/Nearby-Way8870 — 1 month ago

This is something I've been genuinely curious about for a while now. I'm still early in my learning journey and I keep hearing that Java is old, that newer languages have taken over, and that nobody wants to touch it anymore. But then I look at big companies, banks, insurance companies, large tech firms, and they are all still running massive systems built on Java. So what is actually going on here?

Like I get that switching an entire codebase is not a small thing, but these companies have money and resources. If Java was truly inferior they would have moved on by now, right? There has to be more to it than just "it's too expensive to switch."

Is it the performance? The maturity of the ecosystem? The fact that there are millions of Java developers in the workforce so hiring is easier? I honestly don't know enough yet to answer my own question and that's why I'm asking.

I'm currently learning backend basics and Java keeps coming up as something worth understanding if I want to work in enterprise or fintech down the line. Trying to figure out if that reputation is earned or if it's just legacy inertia keeping it alive.

Would love to hear from people who actually work in environments where Java is the main language. What does it do well that keeps companies locked in? And do you see that changing anytime soon or is Java just built into the foundation of how large scale software gets done?

u/Nearby-Way8870 — 1 month ago

Genuine question and I want real answers, not just the usual "learn to code bro" speech people were giving out five years ago.

I keep seeing two completely opposite takes everywhere. One side says AI is going to wipe out entry level dev jobs in the next few years and learning to code now is basically training for a position that won't exist. The other side says developers who understand AI tools are going to be more valuable than ever and coding knowledge is what separates someone who uses these tools well from someone who just gets average results out of them.

I'm a 24 year old guy from New York, been thinking about getting into software development seriously for about a year now. I haven't committed to anything yet because this exact question keeps stopping me cold. Feels like every time I'm about to start I read something that makes me pump the brakes.

Here's what I actually want to know. Not theory, not predictions, just what people in the field are seeing right now. Are junior developers still getting hired? Are companies actually cutting dev teams because of AI tools or are they just using those tools alongside their existing engineers? And if you're someone who codes professionally today, do you feel like your skill set is becoming less relevant or more?

I'm not scared of putting in the work. I just don't want to spend two years grinding through something and come out the other side into a job market that moved on without me. That's the honest truth of where I'm at.

What do you all actually think?

u/Nearby-Way8870 — 1 month ago

Honest question because I keep going back and forth on this. I've been learning programming for about eight months now, mostly self-taught, and I keep seeing people say Java is dying or that nobody uses it anymore. But then I look at job boards in NYC and there are still a solid number of Java backend roles posted every week, so I'm genuinely confused about where it actually stands right now.

I'm not trying to pick a language just because it's trendy. I want something with real job prospects and a strong foundation that helps me understand programming deeply. Java came up as a recommendation from a few people in my circle who work in software, but online I keep seeing takes that make it sound like a waste of time in 2026.

For context I've been working through basic data structures and have touched a little Python and some JavaScript. Java feels more structured and strict which honestly I kind of like because it forces me to understand what I'm doing.

So for anyone who has been in the industry for a while or is currently working with Java professionally, is it still a smart move to go deep on it? Is it being replaced in most companies or is it holding strong in certain areas like enterprise, Android, or backend systems? Would appreciate real talk from people with actual experience, not just the usual "learn what you love" advice.

u/Nearby-Way8870 — 1 month ago