arisuchan    [ tech / cult / art ]   [ λ / Δ ]   [ psy ]   [ ru ]   [ random ]   [ meta ]   [ all ]    info / stickers     temporarily disabled

/λ/ - programming

structure and interpretation of computer programs.


File: 1508200233260.jpg (36.95 KB, 318x474, 51HQ7J6CVJL._SX316_BO1,204….jpg)


Hello. I'm curious about the language Fortran and how many of you use it or know it. I'm going to purchase a few books on it since I'm interested in learning it for myself.
8 posts omitted.


Don't worry, I wasn't planning on learning '77. I think I'll pick up a book sometime soon on 2008 though, it seems really nice.


Consider 2016, since it is the latest version.


Whoops, I'll do that instead then.


I got my first job because I knew FORTRAN 77 (in 2011). Not good to learn in isolation, sure, when you could learn a newer version, but FORTRAN 77 and even FORTRAN IV are still pretty valuable skills if you want to program in nuclear/aerospace.


if you know procedural programming,
you basically know fortran.

File: 1508102184635.jpg (35.4 KB, 800x800, usbttl.jpg)


how do I into USB programming?
1 post omitted.


Here is a great resource to start out with:


I'd say check out the osdev wiki. There should be something about USB drivers on there.



What do you guys want to do? Write drivers? Send data? Receive input from your own HW? Hack your things (e.g. with Wireshark)?

My first foray into USB was when I tried to connect a Nintendo NES Pad to the PC via Arduino. It's really easy and you get to write a simple driver that translates the 8 Buttons to Keyboard presses, which then can be used inside Emulators or wherever you want. This was many years ago though so I don't have that tutorial anymore, but this will teach you some of the basics.


One good project is a USB keyboard or mouse. You don't need very much of the standard to make it work so it is quite approachable.

I managed to get a working mouse just by wiresharking what bytes my real mouse was sending and then copying that.


Download RealTerm.
It's a Windows app, but it lets you watch and rewrite stuff going over USB. Pretty fun if you ask me. I'm sure there are other Linux tools too.

File: 1519955916707.gif (3.97 MB, 958x548, grimes.gif)


Does language A being Turing complete imply that it is possible to write another Turing complete language in language A?


If (by "write a language", you mean "write a compiler or interpreter"){ yes; }


Turing complete means that it is possible to write anything that is possible in any other Turing complete language.



And this is why Conway's Game of Life can be implemented in Conway's Game of Life.

File: 1518798213628.jpg (11.38 KB, 250x208, 15108604501500s.jpg)


Hi! I do programming and DIY when I'm bored. I'm used to Python, C++ and a soldering iron. What kind of project should I do? I've been procrastinating for a few months and I'm a bit scared now.


Do nice stuff that you would actually use, even if you end up cloning existing projects. It all depends on what your interests are.

File: 1518025665185.gif (1.75 MB, 492x270, 1484086522935.gif)


I decided to learn Python. Idk why I didn't go with the official tutorial; I found some plain tutorial on Python online (I don't need theory on how stuff works, I know C already) and decided to go with that. It was okay until I noticed some flaws in the site itself, then errors in the code, and finally this.

Chapter about loops, if statement, break/continue. Prime numbers. This guy says that 1 is a god damn prime number.

Fuck this, I'm gonna go to the official docs. No more soykafty FAQs or tutorials.
3 posts omitted.


Where do you live? I know in Germany we have a lot of translated books / books written in German.

I learned Python partly by myself with the help of some books from No Starch Press (if you like, I can share the PDFs with you). They're very good at explaining stuff in my opinion; the rest I learned in classes.


I live in Poland, there are some books translated (like O'Reilly ones, and some by No Starch Press too).


File: 1518183962598-0.pdf (5.38 MB, pythoncrashcourse.pdf)

ah, Poland. I assume you know English well enough to learn from English textbooks?

A book by No Starch I can recommend is Python Crash Course; it gives you a good and solid first step into Python, though it doesn't cover everything. It's good for a beginner to get a 'feel' for it.

I uploaded it for you, I hope you like it and maybe you can get it in Polish :3


I know English well enough, I'm already doing the Python tutorial from the official docs.

It's available in Polish, but I don't think it would be money well spent if the official tutorial is good enough.


File: 1518379718288-0.pdf (6.6 MB, Data Structures and Algori….pdf)

File: 1518379718288-1.pdf (8.17 MB, OReilly Introducing Python….pdf)

File: 1517870508617.png (39.09 KB, 763x768, 763px-FOSDEM_logo.svg.png)


The recordings from this year's FOSDEM are slowly being uploaded. There were so many interesting talks I don't even know where to start.

Let's have a thread for discussing the talks we watched!

Here's the website, the recordings are linked from the individual talk's pages:
1 post and 1 image reply omitted.


Yes, all of them were recorded, more than 600 talks of various lengths. You can track the state of the various videos here; as you can see, only one talk has been lost so far:

So far I've watched these:
The many ways of using Guix packages
This is an introduction to Guix, which is a functional package manager similar to Nix, written in Scheme. It presents some common use cases and a high level overview of how it works. Guix is really cool!

Tying software deployment to scientific workflows
This is another Guix talk, mainly aimed at high performance computing for scientific purposes. It shows how Guix can be used to tie together the many parts of the usual scientific computing workflows and how it could help with making research reproducible.

DIY Java Static Analysis
This is a very short talk about symbolic execution in Sonarqube. It was a bit of a disappointment: the topic is really interesting, but he didn't have much time for it. It could have been much better if he had gone deeper into the details.

JavaParser: where should we head?
I really liked this one. It's about a parser for Java, which doesn't sound that interesting at first. First he shows how to parse code with it and how you can also generate code using the same objects. It gets exciting when he starts talking about how you can also query the source code…


Best part of FOSDEM in general was Nvidia vs AMD. Can fully relate to

"FOSDEM summary

NVIDIA: Nouveau developers discussing their limitations and struggles dealing with delayed drops of signed firmwares and features they are unable to support without Nvidia's help

AMD: actual AMD staff presenting their latest open source Vulkan driver, their commitment to maintaining their drivers in the upstream kernel with day-one support for new products, and inviting the community to contribute"


That Nouveau talk was so sad.


Personally, the best part of FOSDEM for me is n-gate's reactions to it.

Apart from that I'm interested in the FPGA/CPLD/SoC talk, though there are no good tools on either the free or the commercial side.


File: 1516752991252.jpg (104.98 KB, 1280x720, maxresdefault.jpg)


hi lains,
i'm working at a small software company. they're developing software which could help people if it were open source. so i'm asking myself: how could i leak it, and what should i leak to the public? what files are needed for reverse engineering?
6 posts and 1 image reply omitted.


File: 1516871378623.jpg (74.62 KB, 1024x768, f194df6a73bf64d98f929605d4….jpg)

dood don't leak, it's not worth it because of legal repercussions unless you're looking forward to having your horizons and anus expanded in jail or something

if the software actually is useful, start a free software clone under a pseudonym and gather people that would benefit from the project to help you. you can cheat and look at the original closed-source program to get the design right or to get unstuck on hard problems. your employer won't take legal action as long as they don't suspect the project owner is one of their employees, and you only risk breaching your employment contract by working on a competitor product rather than outright leaking trade secrets.
also the free software version has the potential to eventually become better than the original, and having a free version available is way better than a leaked proprietary one.


They wrote that it could help people if it were open source.


Whatever it is, if you say it would help people were it open source, you should definitely leak it. Even if distributing it would be impossible within large corporations, things like the AUR and hidden service repositories will always keep a copy, plus there's BitTorrent.


op again.

i could try to get the source from the build server. i just found a little hole in its security.

the next step: to whom could i leak it? do you know any engineering/tech collectives?


Wookieleaks and all imageboards you can think of.

File: 1516263179192.gif (1005.36 KB, 500x357, lain navi.gif)


How do I get a complete understanding of computers and programming in general?
My second question is: how do I rebuild my mathematics education for programming and CS?
What books should I read? I'm looking for advice and resources.

I'm currently trying to learn x86 assembly through the book "Programming from the Ground Up".
I really want to understand how computers work inside and out.
1 post omitted.


File: 1516359844532.jpg (119.31 KB, 537x736, 5549b396b499b79e554ca3ec59….jpg)

That's a lot to cover. You will probably want to start with digital electronics and build your way up from there. I think Code: The Hidden Language of Computer Hardware and Software and The Elements of Computing Systems are two books that try to go through all the abstraction layers but I've not read either so I can't tell you how good they are.


Anything is appreciated. I really want to understand it. My current plan is to learn C and assembly for starters. I also have to rebuild my mathematics. Any resources are appreciated, any advice: learning strategies, ebooks, anything you guys can give me.


File: 1516428679503-0.epub (8.76 MB, [Stephen_Kochan]_Programm….epub)

File: 1516428679503-1.pdf (8.49 MB, RE4B-EN-real.pdf)

You could pick up Stephen Kochan's Programming in C, 4th ed., or the K&R.
RE for Beginners
another link :
CS:APP/3e is kind of hard to find unless you want some really soykafty pdf photocopies.
Also, for math I would just recommend Khan Academy to refresh yourself.


File: 1516449444405.png (566.06 KB, 1048x783, 1506967986.png)

I can't really give you concrete resources, I studied most of these at university in my native language. But I'm sure it's not hard to find good books for everything, if you are in trouble you can always look up university courses, they often publicly list the textbooks they use, using those is a good start.

The topics I would cover:
- digital circuits
- coding theory
- embedded systems (FPGAs, Verilog/VHDL, etc.)
- embedded programming (RISC assembly, embedded operating systems, etc.)
- computer architecture (CPUs, memory, HDDs, periphery connections)
- operating systems
- networking

I think this would cover most of the "complete understanding of computers." Of course you don't have to go very deep into each, you could spend a lifetime studying any of these, just understand the general principles and main ideas.

For the programming part I could give something similar for the practical parts but I'm not that familiar with the actual theory behind it.


File: 1516842396366.jpeg (109.33 KB, 1008x720, lain wireed.jpeg)

Thanks guys, I really appreciate it.

File: 1496880422799.jpeg (143.71 KB, 638x826, stack_smash.jpeg)


In binary exploitation, what level of knowledge should one have of the C programming language? In my case I am interested in Windows exploitation, but at any level, how much should one know?

I know that understanding the way memory and processing work with compiled C programs is essential, but should I also be able to develop full programs in C? If not, then what advantages can knowing C well give?

Thank you.


You should know the whole language. It's not big, so don't worry. It's also helpful to have some idea of how the code is going to look once compiled, and being able to write code that could have been the source of an assembly listing is handy too.


>but should I also be able to develop full programs in C?
That's a vague question, here's a full program:
#include <stdio.h>

int main() {
    printf("Hello, world\n");
    return 0;
}

On the other hand, C programs often depend on algorithms and such, which are probably out of the scope of binary exploitation. So you probably don't need to be able to write a text editor to be able to exploit a C program.
But I do think you should learn C to some level, that way, as >>205 said, you can experiment writing-compiling-disassembling and also using a RE tool (IDA Pro or whatever) to get used to exploring the inner structure of a C program.
However, there are many things unrelated to the implementation language that are also important, like the layout of a binary file in memory, the environment variables it inherits, and so on. Where I'm going is: since C was developed for Unix, it has many Unix-isms (file descriptors, for example), and I wouldn't know how those map to Windows, so besides a basic understanding of C, you should probably focus on how Windows deals with it.


A decent grasp of C, plus knowing about memory and debugging programs.
you can try this :


Get proficient in C, learn what all the functions in the standard library do, learn how memory works in detail (heap allocation, stacks etc.). Learn about all the UB etc. in the C language. Since you mentioned Windows, learn the common Windows API functions (there are a lot of them).

Learn x86 assembly (assuming that's your target platform). You don't need to know every instruction by heart, but you should have a decent grasp of how the architecture works. Decompile C programs and reverse-engineer them. Get familiar with your debugger of choice (I use GDB, but that probably isn't ideal for Windows use).

Go to - Read the slides, and do the challenges in the VM image.

Play CTFs.

Congrats, you are now a binexp expert in 4 easy steps!


Bear in mind that modern binaries have protections, so make sure to learn how those protections work and how to work around them once you have the basics nailed down.

File: 1515351919214.jpg (50.3 KB, 768x384, spectre_meltdown.jpg)


So I was reading through the example code in the Spectre vulnerability paper and found this line:
/* Avoid jumps in case those tip off the branch predictor */
x = ((j % 6) -1) & ~0xFFFF;   /* Set x=FFF.FF0000 if j%6==0, else x=0 */

How does this work exactly? I'm still kind of new to programming so I didn't know that this was possible.
If you want context you can also find it here all the way at the bottom, on line 57.

Also general Spectre/Meltdown thread I guess…

Other resources: (Computerphile's Video)


They wanted the code to be sequential operations so the branch predictor wouldn't be influenced. If they used an if statement or the ternary conditional operator, it would be compiled into jumps, which influence the branch predictor.

Now let's see how it works.
(j % 6)

This gives j modulo 6, the result will be 0 if j can be divided by 6 and between 1 and 5 otherwise.
((j % 6) -1)

This subtracts one from the modulo. If j is divisible by 6 the result will be -1, which is 0xFFFFFFFF in hexadecimal (computers use two's complement to represent numbers). Otherwise it will be between 0x00 and 0x04.
((j % 6) -1) & ~0xFFFF

The ~0xFFFF means negating 0x0000FFFF, i.e. flipping the bits, so it becomes 0xFFFF0000. The bit-wise AND (&) of this and the result of the previous expression will be 0xFFFF0000 if it was -1 (divisible by 6) and 0x00000000 otherwise.

Which is what they wanted, but without the jumps. There are no "if this, then that", just a sequence of mathematical operations, so the branch predictor, which is only concerned with conditional execution won't be involved. It's a bit cryptic at first but if you know all the operations it is actually pretty straightforward.

I used 32-bit words in this post; for 64-bit processors just double the length of the values.
