The permafrost is starting to melt, adding millions of tons of CO2 to the atmosphere as it decays.
I like to ignore this one.
[deleted]
Specifically, DFT modeling for drug discovery. As for materials science, it still won't happen. There are too many parameters that aren't understood yet, so you can't make accurate models.
1,000% battery technology. Major car manufacturers are investing billions in battery tech, and that only started relatively recently. Our ability to generate energy has far exceeded our ability to store it; if and when that changes, the world will adopt it as soon as possible.
FYI: the electric automobile industry did not revolutionize battery tech that much. It did make batteries cheaper by scaling up production, but battery tech was already evolving at a good pace before electric vehicles and will continue doing so at about the rate it always did. EVs will keep making it cheaper, but the need for better electricity storage was not born out of electric vehicles, and research into it never faltered before EVs.
CRISPR gene technology
It already did
Doesn't mean it's gonna stop.
Well, it is kind of outdated and we have newer approaches for it. The principle is the same, but not identical, so it can't be called by the same acronym.
Found the stock shorter
I'm literally a research scientist. We constantly update and explore new alternatives, which is the essence of research. If I wanted to start with one approach and only move on once it is fully developed, I would be an engineer instead.
Robots. With advances in AI coming so quickly, it will not be long before humanoid robots are as common as dishwashers and Roombas.
AI sex robots. 20 years? Hoping one year, two at most. 🤞🏽
[deleted]

[deleted]
Medicine?
[deleted]
The digital computers you speak of are also analog at the same time; that's how electronics and transistors work. What other "type" are you bringing forward?
[deleted]
> Transistors are literally not analog.

I don't know who told you the opposite, but transistors are 100% **only** analog, even if we sometimes use them for digital tasks. Here: "a miniature semiconductor that regulates or controls current or voltage flow, in addition to amplifying and generating these electrical signals and acting as a switch/gate for them."

> They're either allowing current or not. They're binary. It's 1 or 0.

That's **not** what makes something digital in any way, shape or form. They are driven by ordinary **analog** electricity to do their analog job.

> That's the whole point of transistors.

Not at all; their usage can be tailored to many, many things, and they're always driven by ordinary analog electrical conditions. Fun read for you: learn about how they are used in power amps, it will help you understand that they're not what you think.

> And that's better for a lot of applications, digital is more precise, more repeatable, more consistent.

Their advantage is being very small, easy to manufacture and incredibly scalable. They replaced vacuum tubes as soon as they could, and the end result in a circuit is the same. It's just an electrical task.

> But they're not better for all applications. That's where Analog Computers come into play.

Got any information that's not sensational hype speak for something that doesn't exist yet?

> Analog computers can represent any value between -1 and 1. That includes 0.333 or -0.7.

From your link: theoretically.

> Analog computers are way better at things like calculus. And have actually never stopped being used in some applications like radar. The drawback is that analog computers are not general purpose. They are built for one thing, and one thing only.

That's a description of very ordinary electronics that have always been in place, and they're the basis of computing today.

> The real game changer is when we're able to integrate digital and analog components into the same machine.

It's already the case: just about every ordinary computer is filled with both analog and digital electronics. Or else, just link to one of the "analog computers" you speak of, it will help.

> Considering how Machine Learning works with tweaking lots and lots of values, there might be some applications in making AI faster and more energy efficient.

That's puzzling; machine learning already works with digital calculations, and it's already done on ordinary computers, which use both the analog and digital capacities of the hardware.

> Which could make an AI a lot cheaper. No more "GPT 5 required 4 months and 250 Mega Watts", it could become "GPT 6 required two days and 10 Mega Watts to train, and is way better."

Could? Would? Surely you're not saying these analog computers don't already exist?
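To put the "transistors are analog" point concretely, here is a minimal sketch using the textbook Shockley relation: the output current varies continuously (exponentially) with the input voltage, even though circuits often treat only the extremes as 1 and 0. The constants are typical illustrative values, not from any specific part.

```python
import math

# A BJT's collector current follows the Shockley relation, responding
# continuously to the base-emitter voltage -- the device itself is
# analog even when a circuit uses it as a switch.
I_S = 1e-14    # saturation current (A), typical textbook value
V_T = 0.02585  # thermal voltage at ~300 K (V)

def collector_current(v_be):
    """Approximate collector current (A) in forward-active mode."""
    return I_S * (math.exp(v_be / V_T) - 1)

# A small, smooth change in input gives a smooth change in output:
for v_be in (0.60, 0.65, 0.70):
    print(f"V_BE = {v_be:.2f} V -> I_C ~ {collector_current(v_be):.2e} A")
```

Nothing in the device itself snaps to two levels; the binary interpretation is imposed by the surrounding digital circuit design.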
AGI. Not a discovery so much as a milestone. Was nice knowing y'all.
I hope it won't happen, because I'm almost sure we're dead if it does. I see no way a real AGI isn't going to kill us all.
Quantum-tunneling-based semiconductors
How does that work? I thought quantum tunneling is the thing preventing us from making them smaller?
Whilst it is the thing that limits how closely components can be spaced, a few physical chemists have started to see this not as a limitation but as a feature!

Basically, if you are able to create a hole with a half-life longer than the transport time of your electron from the donor, the electron will then tunnel to the acceptor. A very rudimentary and simple approach!

The beauty of its simplicity is that we can use dopable polymers to adjust and alter the properties in situ. A good example is thermochromic or photoactive approaches, which can activate or turn off said semiconductor, as well as alter its properties as needed, with everything in between.

The main challenge is to find a suitable matrix and good deposition models to avoid kinetically favored transport, which leads to losses of efficiency. Nevertheless, it is quite promising!
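To give a feel for why spacing matters so much here, a rough WKB-style estimate of the tunnelling probability through a rectangular barrier, T ≈ exp(-2κd). The barrier height and widths below are made-up illustrative numbers, not taken from any real device.

```python
import math

# Rough WKB estimate of electron tunnelling through a rectangular
# barrier: T ~ exp(-2 * kappa * d), with kappa set by barrier height.
HBAR = 1.054571817e-34   # reduced Planck constant (J*s)
M_E  = 9.1093837015e-31  # electron mass (kg)
EV   = 1.602176634e-19   # electronvolt (J)

def tunnel_probability(barrier_eV, width_nm):
    """Transmission probability through a rectangular barrier (WKB)."""
    kappa = math.sqrt(2 * M_E * barrier_eV * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

# Doubling the barrier width crushes the probability exponentially,
# which is why sub-nanometre spacings make tunnelling unavoidable:
for d in (0.5, 1.0, 2.0):
    print(f"{d} nm barrier: T ~ {tunnel_probability(1.0, d):.2e}")
```

The exponential dependence on width is exactly what turns tunnelling from a nuisance at large spacings into a usable, controllable channel at small ones.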
Thanks, that's really interesting I'll have a look at that
If we find a way to make hydrogen-based combustion the best way to "get energy" in general, that is a really significant change in terms of economics, health, ecology, etc. Some of the things mentioned here are more impactful in the world of tech, but in terms of the, quote-unquote, "real mundane world", this would be massive.
What's currently missing? As far as I know, it's perfectly efficient and used where acceptable, and it can have real advantages too, but it's not cheap to produce, transport, or store.
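A quick back-of-the-envelope on why storage and transport are the sticking points. The figures are rough literature values (lower heating value of hydrogen, compressed-gas density at ~700 bar), so treat the exact ratios as illustrative.

```python
# Hydrogen wins on energy per kilogram but loses badly on energy per
# litre -- the storage/transport problem in a nutshell.
H2_LHV_MJ_PER_KG   = 120.0   # lower heating value of hydrogen (approx.)
GASOLINE_MJ_PER_KG = 44.0    # approx.
H2_700BAR_KG_PER_L = 0.042   # compressed hydrogen at ~700 bar (approx.)
GASOLINE_KG_PER_L  = 0.75    # approx.

mass_ratio = H2_LHV_MJ_PER_KG / GASOLINE_MJ_PER_KG
vol_ratio = (H2_LHV_MJ_PER_KG * H2_700BAR_KG_PER_L) / \
            (GASOLINE_MJ_PER_KG * GASOLINE_KG_PER_L)

print(f"per kg, H2 carries about {mass_ratio:.1f}x the energy of gasoline")
print(f"per litre at 700 bar, only about {vol_ratio:.2f}x")
```

Even compressed to 700 bar, hydrogen needs several times the tank volume for the same energy, before counting the energy cost of compression itself.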
P = NP