• HDU - 1078 FatMouse and Cheese (Memoized Search DP)


    HDU - 1078 FatMouse and Cheese

    Problem Description
    FatMouse has stored some cheese in a city. The city can be considered as a square grid of dimension n: each grid location is labelled (p,q) where 0 <= p < n and 0 <= q < n. At each grid location Fatmouse has hid between 0 and 100 blocks of cheese in a hole. Now he’s going to enjoy his favorite food.

    FatMouse begins by standing at location (0,0). He eats up the cheese where he stands and then runs either horizontally or vertically to another location. The problem is that there is a super Cat named Top Killer sitting near his hole, so each time he can run at most k locations to get into the hole before being caught by Top Killer. What is worse – after eating up the cheese at one location, FatMouse gets fatter. So in order to gain enough energy for his next run, he has to run to a location which has more blocks of cheese than the current one.

    Given n, k, and the number of blocks of cheese at each grid location, compute the maximum amount of cheese FatMouse can eat before being unable to move.

    Input
    There are several test cases. Each test case consists of

    a line containing two integers between 1 and 100: n and k
    n lines, each with n numbers: the first line contains the number of blocks of cheese at locations (0,0) (0,1) … (0,n-1); the next line contains the number of blocks of cheese at locations (1,0), (1,1), … (1,n-1), and so on.
    The input ends with a pair of -1’s.

    Output
    For each test case output in a line the single integer giving the number of blocks of cheese collected.

    Sample Input
    3 1
    1 2 5
    10 11 6
    12 12 7
    -1 -1

    Sample Output
    37
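
    For the sample, one optimal route is 1 -> 2 -> 5 -> 6 -> 11 -> 12, eating 1 + 2 + 5 + 6 + 11 + 12 = 37 blocks; from the cell holding 12 no reachable cell holds strictly more cheese, so the run ends.

    The solution below is a memoized search (top-down DP). Writing dp(x,y) for the most cheese FatMouse can still eat starting from cell (x,y):

    dp(x,y) = g[x][y] + max(0, max{ dp(xx,yy) : (xx,yy) is 1..k cells from (x,y) in one of the four directions and g[xx][yy] > g[x][y] })

    Since every move goes to a strictly larger cheese count, the recursion cannot cycle; each of the n^2 states is evaluated once and scans at most 4k candidate moves, so the total work is O(n^2 * k).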

    #include<iostream>
    #include<cstring>
    #include<algorithm>
    using namespace std;
    const int N = 110;
    const int dx[4]={-1,0,1,0},dy[4]={0,1,0,-1};   // four run directions
    int g[N][N],ans[N][N];                          // g: cheese grid, ans: memoized best total from each cell
    int n,k;
    // dfs(x,y): the most cheese FatMouse can eat starting from (x,y), memoized in ans.
    int dfs(int x,int y)
    {
    	if(ans[x][y]!=-1) return ans[x][y];         // already computed (-1 marks "unknown": 0 is a valid answer)
    	int sum=0;
    	for(int i=0;i<4;i++)                        // each direction
    		for(int j=1;j<=k;j++)                   // jump 1..k cells in that direction
    		{
    			int xx=x+j*dx[i],yy=y+j*dy[i];
    			if(xx>=1&&xx<=n&&yy>=1&&yy<=n&&g[xx][yy]>g[x][y])   // stay on the grid, strictly more cheese
    				sum=max(sum,dfs(xx,yy));
    		}
    	ans[x][y]=sum+g[x][y];                      // eat the cheese here, then take the best continuation
    	return ans[x][y];
    }
    int main()
    {
    	while(cin>>n>>k)
    	{
    		if(n==-1&&k==-1) break;                 // "-1 -1" terminates the input
    		memset(ans,-1,sizeof ans);              // reset the memo for each test case
    		for(int i=1;i<=n;i++)
    			for(int j=1;j<=n;j++)
    				cin>>g[i][j];                   // grid stored 1-indexed
    		cout<<dfs(1,1)<<endl;                   // FatMouse starts at the top-left cell
    	}
    	return 0;
    }
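
    The same recurrence can also be filled in bottom-up. The sketch below is written for this write-up (not part of the original post): it sorts the cells by cheese count in decreasing order, so every strictly larger cell is solved before any cell that might move to it, and it reads input exactly like the solution above.

    #include<iostream>
    #include<vector>
    #include<array>
    #include<algorithm>
    using namespace std;
    const int N = 110;
    const int dx[4]={-1,0,1,0},dy[4]={0,1,0,-1};
    int g[N][N],dp[N][N];
    int n,k;
    int main()
    {
    	while(cin>>n>>k)
    	{
    		if(n==-1&&k==-1) break;
    		vector<array<int,3>> cells;              // {cheese, row, column}
    		for(int i=1;i<=n;i++)
    			for(int j=1;j<=n;j++)
    			{
    				cin>>g[i][j];
    				cells.push_back({g[i][j],i,j});
    			}
    		// Largest cheese count first: when dp[x][y] is computed, every cell
    		// FatMouse could legally move to has already been finished.
    		sort(cells.begin(),cells.end(),greater<array<int,3>>());
    		for(auto &c:cells)
    		{
    			int x=c[1],y=c[2],best=0;
    			for(int i=0;i<4;i++)
    				for(int j=1;j<=k;j++)
    				{
    					int xx=x+j*dx[i],yy=y+j*dy[i];
    					if(xx>=1&&xx<=n&&yy>=1&&yy<=n&&g[xx][yy]>g[x][y])
    						best=max(best,dp[xx][yy]);
    				}
    			dp[x][y]=g[x][y]+best;               // eat here, then the best reachable continuation
    		}
    		cout<<dp[1][1]<<endl;
    	}
    	return 0;
    }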
    
  • Original article: https://blog.csdn.net/qq_52792570/article/details/125602624