arisuchan

/λ/ - programming

structure and interpretation of computer programs.


File: 1511649036195.png (32.46 KB, 600x600)

Reasons to learn C

C has a significant amount of software written in it, including the Linux kernel. Learn C if you ever want to hack on the kernel, any UNIX-based operating system, device drivers, or other embedded / firmware / device-level software.

It has two major compilers (gcc and clang), with gcc being the oldest and most popular and clang being younger but built on a better technical foundation. This is in addition to a high volume of static and dynamic analysis tools. Learn C if you like languages with lots of tool support available.

C is very good for writing simple single-threaded programs that go fast with low memory overhead. Learn and use C when you want more explicit control over your program's memory usage and the operations it performs, e.g., for implementing searching and sorting algorithms.

Learn C if you ever want to do any kind of reverse engineering or executable binary analysis. It's likely the thing you'll be analyzing will have been written in C or something similar.

Learn C if you want to get a good software engineering job. There are lots of opportunities for both maintaining old C code and writing new C code.

Things that aren't reasons to learn C

Plenty of people might disagree with me on these, but I feel like there are a lot of misconceptions about what C is good for and what it isn't good for. Here are some things it's not so great for.

Don't learn C for writing performant programs. Earlier I mentioned that C is good for writing simple programs with low memory overhead and great single-threaded performance. Anything more complex than this gets nontrivial very fast, because C's concurrency tools are generally poor and difficult to use (complicated multithreading, synchronization, and atomic primitives that take a lot of effort to use correctly).

Don't learn C for writing anything complex that also needs to be correct. Proving the correctness of a C program is more or less impossible without first writing it in a second, more formal language. C is also full of undefined behavior, which is avoidable, but only if you know what to look out for.

Don't learn C if your real goal is to learn C++. Just learn C++ in that case. C and C++ are different languages with very different best practices.

Don't learn C to make it easier to learn other languages later. IMO it's better to start with a functional language like a LISP or haskell than a procedural language like C.

Other things to watch out for

C has no garbage collection, which can be a good thing or a bad thing, depending on your needs. Be careful about putting things on the heap with functions like malloc, since you are responsible for freeing that memory!

C has only limited facilities for working with individual bits, which can actually make it somewhat difficult to write device-level or networking programs where each bit is important. Use bitfields with care.

Compared to other languages, C has a very empty standard library. Any data structure more complicated than a character array you'll likely end up implementing yourself.

How to learn C is the best starting point for finding resources for learning C. It will tell you what's good and what's not. It's maintained by the fine folks in the ##c freenode IRC channel, which is also full of very knowledgeable people. They take C very seriously.


File: 1513464219711.png (93.72 KB, 640x480, 1512107547259.png)

is C shell the same as C?
How do I change my shell from bash to C?
How does C use memory?


>is C shell the same as C?
No, they're very different environments in every respect; they just have similar (though not identical) syntax, owing to the inherent differences between them.
>How do I change my shell from bash to C?
man chsh
I would suggest not doing that, though. Read more about it here:
It is also not the default shell on most Unix systems, and you'd do well to learn the Bourne shell first. Actually, I think it's a good thing not to overlap shell syntax with that of C, again because they are so different from each other in every way. Learn the basic architecture of Unix instead, to make optimal use of both languages.
>How does C use memory?
That's at the core of C, so I couldn't give a good answer in a post. You need to learn C. All I can tell you is that while most other languages do the memory management for you, in C you have to be explicit in allocating chunks of memory and freeing that space when you're done using it.
However, there is much more to it than just that; that's why you need to read a good resource (or several) to really understand what's going on.


File: 1513671695989.png (154.64 KB, 640x480, 17-12-19_19-20-19-Untitled….png)

What did he mean by this?


> Reasons to learn C

Embedded jobs are less soykaf than other coding jobs.


Any good resources on secure C programming? I mean stuff like avoiding race conditions when using standard file functions, writing re-entrant code, and catching integer overflows, out-of-memory errors, and more.


Look up the functions you're using on popular C reference sites; they will usually have security notes. Bottom line: don't trust the user.


>How does C use memory?

C has some support for automatic memory management. All declared variables have the memory backing their values managed in this way. When you exit scope, the variables are automatically destroyed along with the stack frame backing them.

    /* Automatic variables "foo" and "bar" */
    int foo = 5;
    char *bar = "hello";

    /* Automatic variable "baz" with the value being an address
     * (pointer) to dynamic (not automatic) memory of size 10 */
    char *baz = malloc(10);
/* Outside of scope, foo, bar, and baz have been destroyed by C
 * automatically, but the size 10 dynamic memory is still around,
 * and now also impossible to free() since you don't have any
 * references to it. */


File: 1524899631243.pdf (962.6 KB, [Zed Shaw’s Hard Way Serie….pdf)

Having just gotten comfortable expressing my thoughts in Python, I've now turned to learning C myself. I will upload a book by Zed A. Shaw, the "hard way" guy, which I chose as my first book on C. I hope some of you will find it useful.


great book. taught me more than any university professor ever could.


I suggest inspecting this page before picking up a book to learn C:
As you can see Zed A. Shaw's books are not recommended. More reasoning here:

The popularity of teaching material does not always reflect the quality of its content. C is not a Python, Java, or C# kind of language that you can learn from blog posts, tutorial videos, and dumbed-down "teaching" books. In the long run you are going to hurt yourself, badly: you are going to utilize the underlying hardware inefficiently, you are going to write insecure software, and you will leave an open invitation for security exploits. Your first programming book is an important choice, because that's where you pick up most of your programming habits, and if they are bad ones (e.g. printf or malloc in C++ code) you are going to have headaches, and through your code those bad habits will propagate.

Here is a good book on secure C/C++ programming:


File: 1529096385517.png (346.87 KB, 1600x956, 1807x1080_px_anime_Serial_….png)

All I have ever done is write code in Python. I've been meaning to learn C but can't find a good source. I've been recommended K&R by some, but others told me it's extremely outdated. Any recommendations? How will C be for a Python pseudocoder?


I've been learning C as of late, and I really like this site:


It's not mentioned on, but as a professional C programmer the first resource I turn to for technical reference is. I have found all their documentation to be exceptional so far.


What does Lain think about OpenBSD's ksh and KNF code style(9)?


People who aren't weenies don't care about style so long as it makes sense and is used consistently. In general though the OpenBSD codebase is very well regarded.


> People who aren't don't
does anyone else get tired of these attack memes?


C the hard way was exactly what I was looking for, thank you alice
"C programming language" and others are for babies and I fell asleep reading them and didn't learn anything, but the hard way is all exercises which are practical


Yes. Alice requires better attack memes if we're going to win this blasted war.


>Don't learn C for writing anything complex that also needs to be correct. Proving the correctness of a C program is more or less impossible without first writing it in a second, more formal language. C is also full of undefined behavior, which is pretty easy to avoid, but you have to know what to look out for.
to be fair the static analysis and debugging tooling is very good because of C's age and prominence.
the point about formal analysis having to be done in a higher level language would be true of any language at this level, and again because of C's age there are several options when it comes to formal methods.

if you are learning C to get a job then you are living in the wrong decade.

another reason to learn C: it is easy to pick up. there isn't much to the language at all, hence the many pitfalls, but it certainly isn't a difficult language to get started with.

another book to add to the pile, Modern C by Gustedt available for free here:


>if you are learning C to get a job then you are living in the wrong decade.


>if you are learning C to get a job then you are living in the wrong decade.

This is so wrong. Not everyone wants a high-level job. I can't tell if you're trying to keep the competition sparse or what, but there are plenty of great jobs where you will need to use C.


the market doesn't want newb C programmers.

if programming C is a serious career goal then take the enterprise route.


>if you are learning C to get a job then you are living in the wrong decade.
That's not true. Once you go down past a certain level in any stack then everything is written in C. Compilers, interpreters, system libraries, OS kernels, device drivers, firmware. Especially firmware.

Most modern C jobs are in embedded/firmware environments where direct access to hardware is required and speed, code space, CPU time, RAM usage, and energy consumption are issues. C is the only real choice.

>the market doesn't want newb C programmers.
My first job out of university was C. Admittedly the bar is higher, though, because C generally means serious business. You're not just trying to solve a single business problem; you always have to worry about at least a few of the things listed above (RAM, code space, execution time, etc.) at the same time.

>I can't tell if you're trying to keep the competition sparse or what
Probably just ignorance. It's like an iceberg, I guess. Most people are aware their kernel and device drivers are C but don't realize how many separate microcontrollers are in their laptops and phones, never mind how many millions of lines of firmware go onto all of them.


Is anyone here doing Unix systems programming with C? If so, what are you working on?



What data structures and unit testing libraries do you guys use? I have been using glib. It's kind of clunky for testing.


File: 1566052945186.png (12.48 KB, 689x73, Capture _2019-08-17-11-02-….png)

Does the ternary question mark make it difficult to read code? I like it a lot because I don't need to use conditional blocks.


not at all. it is an elegant and clean way to save text in cases such as your example. personally I prefer a space on both sides of ':' but I digress.


Honestly, no. I'd really prefer it if more people used the ternary rather than making six lines out of an if-else statement.


what makes it (slightly) hard to read is the indentation.
just be consistent. End of line should represent end of some idea.


I have this code:


#include <time.h>
#include <stdio.h>
#include <stdlib.h>

int myrand(int min, int max) {
    int new_value;
    static int previous_value;

    if(min > max) return 1;

    do {
        new_value = rand() % (max - min);
    } while(new_value == previous_value);

    previous_value = new_value;
    return new_value;
}

int main() {
    for(register char c = 0; c < 5; c++) {
        printf("%c\t", myrand(33, 126));
    }
    return 0;
}
It works, but it's a little slow. How do I improve this?


It is sufficient to call srand() only once. You don't need to reseed before every call to rand(); it won't improve the randomness.


>if(min > max) return 1;
You may as well get rid of this: there's no way to tell whether your function returned 1 because (min > max) or returned 1 as an actual result.
Your function takes a min and max value, but the value it returns isn't necessarily between those. Is that a mistake? If so, you need to add min to new_value.
Also, your function will lock up if called with (min == max) twice in a row.


You could implement a faster rand(), unroll the for loop, and call printf once instead of five times.

I don't think it's recommended to use the register keyword anymore; I'd assume it'd just do more to fuck up a register allocator than help it.


How do you do unit tests in C? Do people even do them?


#include <time.h>
#include <stdio.h>

int main() {
    int next = time(NULL);
    int rnd_chr = 0;
    for(char c = 0; c < 5; c++) {
        do {
            next = next * 1103515245 + 12345;
            rnd_chr = (unsigned int)(next/65536) % 127;
        } while (rnd_chr < 33);
        printf("%c\t", rnd_chr);
    }
    return 0;
}


 for(char c = 0; c < 5; c++) {
        do {
            next = next * 1103515245 + 12345;
            rnd_chr = (unsigned int)(next/65536) % 127;
        } while (rnd_chr < 33);


for(char c = 0; c < 5; c++) {
     next = next * 1103515245 + 12345;
     rnd_chr = (unsigned int)(((next/65536) % (127 - 33)) + 33);


not that it matters, but the do-while loop re-rolling for values under 33 reveals timing information and one could theoretically recover the next state.

what fun!


It's not a very good prng anyway; I would stick to calling rand() if it matters.



I think you meant:

        rnd_chr = 33 + (unsigned int)(next/65536) % (127-33);

Can't see much further optimisation beyond buffering rnd_chr and using puts.


#include <time.h>
#include <stdio.h>

int main() {
    int next = time(NULL);
    char buffer[11];
    for(char c = 0; c < 5; c++) {
        next = next * 1103515245 + 12345;
        buffer[c*2] = (unsigned char)33 + (unsigned int)(next/65536) % (127-33);
        buffer[c*2+1] = '\t';
    }
    buffer[10] = 0;
    puts(buffer);
    return 0;
}


Benchmarked 3 runs each.

Original code executed in 13892.115234 milliseconds.
Optimised code executed in 0.017000 milliseconds.

That's a pretty good speed up. ;-P



It's probably faster to reduc
