Coding General Discussion

The biggest problem I have with the Indian IT industry is that there is no level playing field.

It's all based on luck and contacts! We're always told to learn new things and excel, and opportunities will flock to you. The reality is far from that. Barely 20-30% of people get promotions and on-site postings based on real performance. Largely it depends on how good your connections are with your higher-ups.

There is no transparency in the appraisal processes of the big companies. Even if you get the highest rating, you might end up with a raise lower than someone with a Category 2 or Category 3 rating.

I have seen multiple rating-3 guys going on-site and getting promoted to higher roles.

You might work day and night for decades and never be considered for any promotion, while a newbie with just 1-2 years of experience goes on-site and enjoys life!

Harsh Reality!
That's also very true.
Some (or most) companies follow a bell curve for appraisals, meaning part of the population has to be fitted into the lowest end of the bell and gets no salary hike or band progression, even if the entire population performed well. This is the height of stupidity, and the managers and HRs bluntly say "you are in the lowest bucket; if you don't improve, you may be demoted or fired" :rage: :wtf: These companies work on an outsourcing basis and don't make globally competitive H/W and S/W, yet they blame us.
Filling in the appraisal portal feels like writing an essay in philosophy. :facepalm4:
I have always prayed not to be sent abroad because I'm a simple, satvik person 🙏. That way I can't climb the ladder much, but that's OK for low-IQ people like me :eric:. The hierarchy is triangular; everybody cannot get promoted, anywhere.
But everybody's pay needs to rise by at least government FD rates just to beat inflation. Also, automation, AI, robotics, etc. will keep reducing human jobs. 🤖
I'm speculating that if AI and automation are not regulated, the IT industry will become like the armed forces' SSC (Short Service Commission): most IT folks will have to retire by age 40-45. This will impact technical education colleges too.
People like me who dislike PCM and work in sysadmin and tech support departments will also have to adapt by diversifying our skills. Basically, we will have to upskill on everything in the datacenter: storage, switches, routers, firewalls, servers, virtualization, cloud, databases, etc. ☁️
 
Can you explain what is in the raw data file?

If you mean the file in the network folder, that's where I am saving the architecture, weights, biases and other stuff (activation functions, etc.) of the neural network at the moment. This pretty much allows me to train a network, save it to disk and then reload it into memory in case I want to keep some configuration or port it over. When I get to doing CNNs, training is going to take a lot of time, so I don't want to build a network and train it from scratch every time; loading it directly makes it easier.

If you use a hex viewer to debug it, I've laid out all the parts in sections (stored in little-endian):
- Architecture (number of neurons per layer, unsigned int32)
- Weights (for each neuron sequentially: float64)
- Bias (float64)
- Activation type (sigmoid/relu/tanh, float32)
- Learning rate (float32)

Between each section is the 4-byte sequence 0x55AA0000 that marks the end of that section. (fileops.c in src/ contains the functions that interact with the file; they're called from save_network() in nn.c.)

Can you tell me where I can pick up knowledge about neural networks and deep learning, from basic to advanced? I want something simple and lucid to understand for a dumb idiot like me.
I'ma be real, a month or two ago I didn't know what the frick a neural network was either lol :troll:

I pretty much made this by watching 3Blue1Brown's YouTube videos on neural networks (great channel!) and reading about perceptrons (which is what this project originally was: just able to learn linear mathematical relations like y = 2x).

Here's a bunch of articles I found useful:

https://www.kaggle.com/code/ryanholbrook/a-single-neuron - a good explanation on perceptrons
https://xnought.github.io/backprop-explainer/ - great intro to backpropagation

Perceptrons are relatively easy to implement; they are kind of like PID loops. When I made a single-input, single-weight neuron, I basically gave it input data and some expected output data. The difference between the actual output and the expected output drives the change in the neuron's weight, so it automatically "centers" or corrects itself toward the desired output.
 
Thanks a ton mate
 
Yes, the Rust thing indeed seems like FUD.
It will meet the same fate as Ruby, Dart and Scala.

Do you worry that, due to the massive influx of people into the tech industry, we will face a serious employment crisis soon?

I mean, I'm seeing colleges shutting down other branches to accommodate seats for CSE students; on top of that, almost all non-IT branch people are also trying to get into the sector.
 
It is already happening as we speak.
Friends with 3 to 4 years of experience at Amazon are being laid off.
Entire teams are being dissolved.
IT, being an unregulated sector, takes employees for granted; people are hired and fired on a whim.
After 2021 there was an exceptional hiring frenzy which led to oversaturation. Industry demand didn't catch up, and the Russia-Ukraine war tanked retail consumerism worldwide.
Currently the COVID hires are being laid off. You don't hear about them because most companies offer 5 months' salary if they quit themselves vs. 3 months if the company lays them off.
 
I very strongly disagree with your opinion regarding Rust replacing C++. Memory management was the biggest issue with C/C++; with the advent of smart pointers and STL allocators, that's no longer an issue.

In my field, C++ is king and will remain so. It allows far more freedom for low-level programming than the alternatives.

I didn't say C++ will vanish, but Rust will be used where it is more convenient. C++ is a more complex and bloated language; even its creator may not be aware of every feature. C++ will remain dominant in finance and gaming, but Rust will find a place in the sun for sure.
 
For now, there is still a dearth of good senior engineers. Finding good senior backend engineers has been a chore for me. Ask the candidates about a simple connection pool and they are like "hUH? mE usE RAW JDBC".
So if you are a good senior engineer, there are still plenty of jobs around and you can even pick and choose.
 
Since we're talking of open-source contribution, I'm making my own neural network from scratch in C:

If anyone wants to contribute, feel free to do so! I've tried to document things as much as possible. I'm able to scale up the neurons/layers in a generalized fashion, and after I solve the problem of efficient I/O for the training/testing data, I should be able to train this on the MNIST dataset.

I read on Reddit that this is how they teach machine learning at MIT: you get a good idea of how machine learning works by building one from the ground up.
 
I've ordered an i5-7500T M710q barebone mini PC at 7K because I have a spare SSD and 2x4 GB SODIMMs in the drawer.
Might build a cloud storage and media consumption server and such. Wish luck for the noob me.

Idle and peak consumption is 5-8 W / 25 W, so it should not hurt my pocket.
 
Hey, document your journey in this thread with the cost of components; I would like the same. Also, suggest me a good compact CPU with CUDA support for generative AI. I don't know the latter but want to execute some projects from GitHub.
 
CUDA is proprietary software; it runs only on NVIDIA GPUs.

Regarding creating generative AI, you need some serious power: a Titan-series GPU. A typical node in an HPC cluster has 8 or so A100s/H100s, with dozens of such nodes. That sort of compute power is needed for generative AI.
 
Not really; a consumer-grade RTX GPU with 8 GB or more VRAM can run Stable Diffusion models and Llama-7B LLMs with ease.
 

No memory-managed program will ever break through the performance/memory/cache-optimization barrier, hence Rust is not going to displace the C/C++ ecosystem. Rust is good, but it is not a complete systems programming language like C/C++.

As far as bloat is concerned, one can use C++ without the STL: just use the basic language features and system calls. Most C programs are valid C++ programs, hence one doesn't need to master every C++ feature. Unfortunately, it takes at least a decade to become an expert in C/C++, because it is very easy to write dangerous, buggy programs in those languages.

C/C++ are the natural languages for implementing the POSIX standards, which means that Rust will never be used for writing OS kernels. In fact, it is not easy to write system calls in C++ due to the name-mangling (an OOP feature :)) issue, and that is where the good old C programming language shines.

Rust will not vanish though; it has its utilities, like Java, Python, Scala and JavaScript, to name a few.
 
The feedback I am getting from many people is that learning only Python is insufficient.

Often, for different projects, you have to learn a new language from scratch, but since you know the basics you pick it up fast.

Learn Python, but learn it deep; then it will be easier for you to pick up a new language.
 
A lot of resources who switched companies during COVID times and got 2-3x their initial packages are being laid off en masse. The industry is normalizing back from the crazy packages given during that time.

These resources are put on the bench and asked to find, by themselves, a project that can support their billing.

The packages given were absurd. A .NET resource with three years of experience ended up with a 15-16 lakh package. He was very average and struggled to deliver anything individually. He finally got laid off a few months back and is now running his father's kirana shop, as he is not employable anywhere in IT.

I've heard of 15-20 such cases so far in my extended circles.
 
Never underestimate or lowball yourself. IQ is not a measure of any person's worth.

Try to find the things you understand more easily and focus on learning something every day.
 
I'm not lowballing myself, just being humble and accepting the truth that I am not meant for hi-fi PCM-based jobs. Not every tech job requires that.
Hence sysadmin and tech support jobs are enough for me, and they also require constantly restudying fast-evolving networking, storage and server tech.
 
All jobs are jobs!
The areas of expertise differ, but everything has importance. Sysadmin and tech support jobs exist so that the STEM or PCM resources can perform.
You are enabling them to do meaningful work.
 
The future is digital. Everything converges on a datacenter: email, VoIP, video sites, ATMs, e-commerce, net banking, digital payments, sat-TV, public CCTV surveillance, R&D machines, big data, hospital patient records, office employee records, etc.

So people like me who want to be techies, but without PCM, can do the following study and exams -
 
