242. Valid Anagram
2022-07-01 03:43:00 【Sun_ Sky_ Sea】
Original problem link: https://leetcode.cn/problems/valid-anagram/
Given two strings s and t, write a function to determine whether t is an anagram of s.
Note: s and t are anagrams of each other if every character appears the same number of times in both strings.
Example 1:
Input: s = "anagram", t = "nagaram"
Output: true
Example 2:
Input: s = "rat", t = "car"
Output: false
Constraints:
1 <= s.length, t.length <= 5 * 10^4
s and t consist only of lowercase English letters
Approach:
Count the occurrences of each character in each string into a dictionary, then check whether the two dictionaries are equal.
Code implementation:

import collections

class Solution:
    def isAnagram(self, s: str, t: str) -> bool:
        # Build a character -> count mapping for each string.
        a = collections.defaultdict(int)
        b = collections.defaultdict(int)
        for c in s:
            a[c] += 1
        for c in t:
            b[c] += 1
        # The strings are anagrams exactly when the counts match.
        return a == b
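As a side note, the same counting idea can be written more compactly with collections.Counter, which builds the frequency dictionary in one call; a sorting-based check also works, at O(n log n) instead of O(n). The function names below are illustrative, not from the original:

```python
from collections import Counter

def is_anagram(s: str, t: str) -> bool:
    # Two Counters compare equal iff every character count matches.
    return Counter(s) == Counter(t)

def is_anagram_sorted(s: str, t: str) -> bool:
    # Anagrams produce identical sorted character sequences.
    return sorted(s) == sorted(t)

print(is_anagram("anagram", "nagaram"))  # True
print(is_anagram("rat", "car"))          # False
```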