
/tech/ - technology




File: 1493318361421.jpg (31.72 KB, 619x464, cartao.jpg)


Inevitable in the sense that any civilization with technology approximating our own—before computers—would discover computation. This question is independent of whether their computers would be instantiated anything like those we have designed or built. Another way to think about the question is to consider how our own history might have unfolded such that the invention of computers was accelerated or delayed. How much earlier or later could it have happened?


well, the primary motivating push behind the development of computers was world war-ing, etc. that can be said of just about everything that developed in the 1900s, though, so who knows.

the development of the concept itself seems like an inevitability, definitely, since computability theory came about as a simultaneous realisation by multiple parties, as have most advances in mathematics and logic


File: 1493388692343.gif (10.07 KB, 946x261, s03f06.gif)

I am inclined to think yes: a method for automating computation, that is, the calculation of some result, would have been found.
Actually, while what you mean right now is digital computers, analog computers have been around for a long while. Pic related is an example of an analog computer. But there are much older computers, such as the astrolabe, which has been around since antiquity.
Charles Babbage conceived his difference engine well before the great war.
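for context, the difference engine evaluated polynomials at successive points using nothing but repeated addition of finite differences. a minimal sketch of that principle (function names and the example cubic are mine, not Babbage's):

```python
# method of finite differences, the principle behind Babbage's
# difference engine: tabulate a polynomial at successive integer
# points using only addition, no multiplication.

def difference_table(poly, start, degree):
    """Initial finite differences of poly at x = start."""
    values = [poly(start + i) for i in range(degree + 1)]
    diffs = []
    while values:
        diffs.append(values[0])
        values = [b - a for a, b in zip(values, values[1:])]
    return diffs

def tabulate(poly, start, count, degree):
    """Return [poly(start), poly(start+1), ...] via additions only."""
    d = difference_table(poly, start, degree)
    out = []
    for _ in range(count):
        out.append(d[0])
        # each column absorbs the one to its right, as the engine's did
        for i in range(degree):
            d[i] += d[i + 1]
    return out

p = lambda x: x**3 - 2*x + 5
print(tabulate(p, 0, 6, 3))  # matches [p(0), ..., p(5)]
```

once the initial differences are set, the crank only ever adds; that is what made the design mechanically feasible in the 1820s.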


File: 1493391497709.png (60.6 KB, 670x291, o-o.png)

i took it to mean computation in the "as opposed to conversion" sense, so a slide rule wouldn't count.

that is to say, "a computation device takes some form of input, maintains some internal state which is affected by the input, and produces some output which is affected by the state".
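that definition can be sketched as a toy contrast (a minimal sketch, names mine):

```python
# "conversion": output depends only on the current input.
# "computation": internal state is affected by input, and
# output is affected by that state, so history matters.

def convert(x):
    """conversion device: same input always yields same output."""
    return 2 * x

class Accumulator:
    """computation device: keeps state across inputs."""
    def __init__(self):
        self.state = 0

    def step(self, x):
        self.state += x    # input updates the internal state
        return self.state  # output reads the state

acc = Accumulator()
print([convert(x) for x in (3, 3, 3)])   # [6, 6, 6] - history-free
print([acc.step(x) for x in (3, 3, 3)])  # [3, 6, 9] - history matters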

the jacquard loom, which everybody loves to mention because Punch Cards, also wouldn't count, as it produces its output directly from the input: conversion.

i guess the ol' difference engine does count then, though; hadn't thought about that.


From an information-theory perspective, conversion is computation. All a CPU does is iterate "conversion" of the machine state, which is both input and output. Information-theoretically, computations differ only (?) by the number of bits they erase (those that erase some require energy to perform; those that don't, don't) and the time they need to propagate a signal from inputs to outputs.
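the energy cost of erasure is Landauer's bound: at least kT ln 2 joules per erased bit. a quick back-of-envelope at room temperature (the temperature choice is mine):

```python
import math

# Landauer's bound: erasing one bit dissipates at least k*T*ln(2) joules.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

per_bit = k_B * T * math.log(2)
print(per_bit)  # ~2.87e-21 J per erased bit

# an irreversible gate like AND maps 2 input bits to 1 output bit,
# erasing information each operation; a reversible gate erases none,
# which is why it has no such lower bound.
```

real CPUs dissipate many orders of magnitude more than this per bit, so the bound is a floor, not a description of current hardware.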


File: 1493594347964.png (48.81 KB, 481x656, C.png)

i see it intuitively as a difference in input granularity. that is to say, any deterministic device will give exactly one lifetime-sum output for exactly one lifetime-sum input: if you consider every bit of input a computer ever receives during its lifetime of operation as a single input (ignoring unintended inputs, like electronic interference or whatever), you could build an equivalent "conversion device" that produces the exact same output for that same input.

the difference comes when that input is broken down into smaller pieces. every continuous, start-aligned subset of this summed input could be considered in the same way, and thus emulated in the same way with a conversion device, but the complexity of a conversion device designed to perform all of these conversions would become the sum of the complexities of all those different subsets of the input (you could optimise it down using currying, and maybe some other fancy things, sure, but that doesn't fundamentally change what i'm getting at).

the point is that, as this summed input approaches infinite length over the lifetime of the machine, the complexity of an emulating conversion device would also necessarily approach infinity, while the complexity of the computation device never changes.

you can make a LUT which matches a function with an infinite domain, but only over some finite selection out of that domain.
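the growth argument can be made concrete with toy numbers (the 32-bit register and bit-counting example are mine):

```python
# a stateful running-sum machine has a constant description size,
# while a LUT emulating it over all input histories of length n
# bits doubles in size with every additional input bit.

from itertools import product

def machine_size_bits():
    return 32  # one 32-bit state register, regardless of input length

def lut_entries(n):
    """Number of distinct input histories of exactly n bits."""
    return 2 ** n

def build_lut(n):
    """LUT mapping every n-bit history to the running sum of its bits."""
    return {bits: sum(bits) for bits in product((0, 1), repeat=n)}

for n in (4, 8, 16):
    print(n, lut_entries(n))  # 16, 256, 65536 - exponential growth
```

the machine's description never changes as its lifetime grows; the emulating table's does, which is the inherent difference claimed above.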

so yeh, that feels like an inherent difference to me.


Considering a computer is simply a device for computation and calculating numbers, yes, I believe it would have been inevitable, as it is human nature to find ways to improve efficiency in whatever we wish to do, so as not to have to do it ourselves. Out of laziness. The abacus, for example.

You can also consider another person a "computer". In the past, people took up jobs simply to do routine calculations, for things like ships, etc. This can be seen in the 1640s definition of "calculator": "one who calculates".
