I'm trying to allocate memory for the contents of a file containing words (separated by \n).
How can I replace 16000 so this works for larger files?
My code:
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>

typedef struct node {
    bool is_word;
    struct node *children[27];
} node;

node *root;

bool load(const char *dictionary)
{
    FILE *fp = fopen(dictionary, "rb");
    if (fp == NULL)
        return false;

    node *node_bucket = calloc(16000, sizeof(node));
    if (node_bucket == NULL) {
        fclose(fp);
        return false;
    }
    node *next_free_node = node_bucket;

    // compute...
    // to later free the memory with another function
    root = node_bucket;
    fclose(fp);
    return true;
}
Thanks
Best Answer
You can allocate memory dynamically without knowing in advance how big the file is. The block size I use is a power of 2, which is generally friendlier to block I/O. A little space is wasted when the last block is only partially used, but here is an example you can adapt to work with your node structure:
#include <stdio.h>
#include <stdlib.h>

#define BLOCKSIZE 16384

int main(void) {
    unsigned char *buf = NULL;
    unsigned char *tmp = NULL;
    size_t totalread = 0;
    size_t currentsize = 0;
    size_t currentread = 0;
    FILE *fp;

    if ((fp = fopen("test.txt", "rb")) == NULL)
        exit(1);

    do {
        currentsize += BLOCKSIZE;
        // Grow the buffer by one block; realloc may move it, so keep
        // the old pointer until the call succeeds.
        if ((tmp = realloc(buf, currentsize)) == NULL)
            exit(1);
        buf = tmp;
        currentread = fread(&buf[totalread], 1, BLOCKSIZE, fp);
        totalread += currentread;
    } while (currentread == BLOCKSIZE);  // a short read means end of file

    printf("Total size was %zu\n", totalread);

    free(buf);
    fclose(fp);
    return 0;
}
Regarding "c - How do I determine the size of a file (all of its contents) so that I can allocate memory for it at once?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/36494654/